
ABSTRACT

Title of Dissertation: PRECONSCIOUS INFLUENCES ON DECISION

MAKING ABOUT COMPLEX QUESTIONS

Deep Singh Sran, Doctor of Philosophy, 2005

Dissertation directed by: Professor Patricia Alexander 

College of Education, Department of Human Development

There is evidence that the most widely accepted theories and models of judgment,

decision making and reasoning are inadequate because they do not accurately describe

what people do or are able to do when making decisions. One shortcoming of existing

theories and models may be that they do not account for the potential influence of 

 preconscious processes on decision making and conscious reasoning.

The present study investigated whether preconscious processes influenced

decision making about complex questions based on interviews with 41 state legislators

and 18 doctoral students. This inquiry also examined whether participants’ decision

making processes differed by issue and whether legislators and doctoral students differed

in how they made policy decisions.

Participants were asked to make two educational policy decisions and were asked

follow-up questions about each decision. These follow-up questions were designed to

collect data concerning the source and quality of participants’ evidence, their ability to


generate counterarguments, their certainty in the accuracy of their decisions, whether the

 policy questions evoked an affective response, and how much participants reported

knowing about each decision topic. The study also measured and compared how quickly

 participants made decisions and provided reasons to support their decisions. To complete

the interview, participants were asked to review two decision-making models, a

traditional purely-conscious model and a second intuitive model that incorporated

 preconscious processes, and to select the model that better described how most people

and how the participants themselves made political decisions.

Based on the data collected there is reason to believe that preconscious processes

may influence decisions about policy and other complex questions. Participants made

decisions quickly, with little external evidence to support the decisions. They were quite

certain about the accuracy of their decisions even though many reported having little or 

no knowledge about the decision questions. Participants’ comments also suggested

that one or both decision topics evoked an affective response to the policy question. And

most participants described their own decision making using the decision model that

depicted the influence of preconscious processes. These findings do not support the

accuracy of traditional, purely conscious models of judgment and decision making.


PRECONSCIOUS INFLUENCES ON DECISION MAKING

ABOUT COMPLEX QUESTIONS

 by

Deep Singh Sran

Dissertation submitted to the Faculty of the Graduate School of the University of Maryland, College Park in partial fulfillment

of the requirements for the degree of 

Doctor of Philosophy

2005

Advisory Committee:

Professor Patricia Alexander, Chair 

Professor Roger Azevedo

Dr. Ann Battle

Professor James Byrnes

Professor Bruce VanSledright


© Copyright by

Deep Singh Sran

2005


TABLE OF CONTENTS

List of Tables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi

List of Figures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vii

Chapter I: Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

Research on Decision Making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2

Research on Preconscious Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

Investigating the Decision-Making Process . . . . . . . . . . . . . . . . . . . . . . . . 7

Preconscious Influences on Decision Making . . . . . . . . . . . . . . . . . 7

Comparing Decision-Making Processes for Two Topics . . . . . . . . . . . . 10

Comparing Decision-Making Processes of Two Groups . . . . . . . . . . . . 11

Statement of the Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

Data Sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

Definitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15

Research Questions and Hypotheses . . . . . . . . . . . . . . . . . . . . . . . . . . 18

Chapter II: Review of Relevant Literature . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

Literature Selection Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

Decision Making and Preconscious Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . 24

Traditional Model of Reasoning and Decision Making . . . . . . . . . . . . 25

Preconscious Processes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

The Absence of Introspective Awareness and a Reliance on a priori Causal Theories . . . . . . . . 35

Affect Independence and Affect Primacy . . . . . . . . . . . . . . . . . . 42

Automatic Evaluation Effect . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

The Social Intuition Model of Moral Judgment and Moral Reasoning . . . . . . . . . . . . . . . . . 51

Information Processing May Not Be

Motivated by a Search for Accuracy . . . . . . . . . . . . . . . . . . . . . . 55

Theories about the Interaction Between Emotion and Reason . . . . . . . . . . 56

Affect as a Substitute for Conscious Reasoning in Risk Analysis . . . . . . . . . 64

Non-Consequential Decision Making . . . . . . . . . . . . . . . . . . . . . 66

Reason-based Analyses of Choice and Why Reasons Are So Important . . . . . . . . . . . . . . . . 67

Kuhn’s Study of Argument Skills . . . . . . . . . . . . . . . . . . . . . . . . 68

Causal Theories and Policy Decisions . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

Why Study Policy Decisions Instead of Causal Theories . . . . . . . . . . . . . . . . . . 71

Selecting Decision Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . 72

Political Decision Making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

Political Ignorance and the Construction of Preferences (and


Decisions) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

Theories of Political Decision Making and Preconscious Processes . . . . . . . . . 76

Affective Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77

Symbolic Politics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80

Heuristic and Online Models of Political Decision Making . . . . . . . . . . . . . . . . . . . . . . 81

Critique of Research on Preconscious Influences . . . . . . . . . . . . . . . . . . 86

Intuitive Decision Making and Reasoning Model . . . . . . . . . . . . . . . . . . . . . . . . 87

Knowledge, Experience, and Expertise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90

Chapter III: Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97

Pilot Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99

Participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100

Materials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100

Procedure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

Outcomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

Final Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

Participants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102

Materials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

Decision Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105

Interview Protocol . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106

Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

Response Time: Decision Latency, Analysis Time, Counterargument Latency, and Partisan Latency . . . . . . . . 108

Justifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116

Citing Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118

Justificatory Rationale . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

Counterarguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121

Certainty . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123

Expert Knowledge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123

Self-Assessed Knowledge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124

Affect . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124

Reported Speed to Decision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125

Argument Repertoire . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125

Choice of Decision Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126

Procedure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126

Measuring Response Times . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127

Interrater Agreement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130

Chapter IV: Results and Discussion Concerning Preconscious Influences on
Decision Making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132

Response Times and Reported Speed to Decision . . . . . . . . . . . . . . . . . . . . . . 134

Levels of Certainty, Self-Assessed Knowledge, and Affective Response . . . . . 142

External Evidence and Rationales . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148


Choice of Decision Model to Describe Decision-Making Processes . . . . . . . . . 153

Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155

Chapter V: Comparison of Legislators’ and Graduate Students’ Decisions and
Responses for Two Decision Topics . . . . . . . . . . . . . . . . . . . . . . . . . 160

Comparative Analyses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 160

Comparison of Decision-Making Processes for Two Decisions . . . . . . 161

Participants’ Decisions about Class Size Limits and Privatization . . . . . . . . . . . . . . . . . 161

Ideological Explanations Offered in Support of Decisions . . . . 162

Participants’ Appraisals of the Partisan Characteristics of Legislative Proposals . . . . . . . . . . . . . . . . . 165

Comparing Response Times for the Two Decisions . . . . . . . . . 165

Self-Assessed Knowledge and Certainty for the Two Decision Questions . . . . . . . . . . . . . . . . 166

Participants’ Comments about Decision-Specific Decision-Making Processes . . . . . . . . . . . . . . . . 167

Differences in How Legislators and Graduate Students Made Policy Decisions . . . . . . . . . . . . . . . . . . . . . . . . . . 168

Comparing Legislators’ and Graduate Students’ Response Times . . . . . . . . . . . . . . . . . 169

Comparing Evidence and Rationale for Legislators and Graduate Students . . . . . . . . . . . . . . . . 172

Legislators’ and Graduate Students’ Comments about the Decision Models . . . . . . . . . . . . . . . . 173

Legislators’ and Graduate Students’ Certainty about Their Decisions . . . . . . . . . . . . . . . . 173

Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176

Differences in Decision Making about Class Size Limits and Privatization . . . . . . . . . . . . . . . . . . 176

Differences in How Legislators and Doctoral Students Made

Decisions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181

Chapter VI: Participants’ Selection and Discussion of Decision Models . . . . . . . . . 184

Participants’ Responses about Decision Models . . . . . . . . . . . . . . . . . . . . . . . 184

Participants’ Decision Making Was Subject to Preconscious Influences . . . . . 186

Participants’ Reasons Were Constructed while Responding . . . . . . . . . 202

Decision Models May Be Decision-Specific . . . . . . . . . . . . . . . . . . . . . . . . . 207

Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 212

Chapter VII: Summary, Conclusions, and Implications . . . . . . . . . . . . . . . . . . . 216

Summary and Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 216

Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221


Implications for Practice and Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224

Implications for Practice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224

Implications for Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226

Appendix A . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228

Appendix B . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231

Appendix C . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234

Appendix D . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237

Appendix E . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255

References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257


LIST OF TABLES

1. Descriptions of Variables, Coding Details, Data Analyses, and

Identification of Variables Influenced by Kuhn (1991) 109

2. Legislator (Leg.) and Graduate Student (Grad.) Data for 

Quantitative Variables 135

A1. Number and Percentage of Legislators and Graduate Students

Deciding to Oppose or Support Legislative Proposals to Limit

Class Size or to Privatize Public Schools 237

A2. Number and Percentage of Legislators Citing External Evidence,

Personal Evidence and Nonevidence in Response to Interview

Question 1 and Subsequent Probe for Detailed Information 238

A3. Number and Percentage of Graduate Students Citing External Evidence,

Personal Evidence and Nonevidence in Response to Interview Question 1

and Subsequent Probe for Detailed Information 239

A4. Number and Percentage of Legislators and Graduate Students Offering

Specific Types of External and Personal Justificatory Rationale in Support of their Policy Decisions 241

A5. Number and Percentage of Legislators and Graduate Students
Generating Counterarguments 242

A6. Number and Percentage of Legislators and Graduate Students
Characterizing Class Size and Privatization Proposals as Liberal
or Conservative Positions 243

A7. Number and Percentage of Legislators and Graduate Students Selecting
Traditional Model, IDMR Model or Both to Describe How Most People

Make Political Decisions 244

A8. Number and Percentage of Legislators and Graduate Students Selecting

Traditional Model, IDMR Model or Both to Describe How They

Themselves Make Political Decisions 244

A9. Individual Legislator’s and Graduate Student’s Data for Class Size and

Privatization Decisions 245

A10. Chi-Square Analyses of Certain Frequency Data 254


LIST OF FIGURES

1. Traditional Reasoning and Decision Making Model 27

2. Intuitive Decision Making and Reasoning Model 29


CHAPTER I

INTRODUCTION

The study of decision making concerns how people make choices, why people

make one choice and not another, how and why the decision-making process differs for 

different people and different decisions, how to model individual and group decisions,

and how to predict future decisions (Baron, 2001; Kahneman & Tversky, 2000; Kuhn,

1991; Marcus, Neuman, & Mackuen, 2000; Stanovich & West, 2000; Voss, Perkins, &

Segal, 1991). There is evidence that the most widely accepted theories and models of 

decision making are inadequate because they do not accurately describe what people do

or are able to do when making decisions (Baron, 2000; Cherniak, 1986; Evans & Over,

1996; Green & Shapiro, 1994; Kahneman, Slovic, & Tversky, 1982). Research in social

psychology and cognitive science suggests that one shortcoming of existing theories and

models of decision making is that they do not account for the potential influence of 

 preconscious processes on decision making and conscious reasoning (Chaiken & Trope,

1999; Damasio, 1994; Denes-Raj & Epstein, 1994; Gilbert, Tafarodi, & Malone, 1993;

Haidt, 2001; Murphy & Zajonc, 1993).

The present study investigated whether preconscious processes influence decision

making about complex policy questions. This study examined political decision making,

instead of general decision making about other important and complex subjects, for at

least three reasons. First, almost all adults in the United States are likely to encounter and

are entitled to make political decisions that shape public policy. Second, political

decision making is itself a worthy subject to study given the impact of such decisions on

almost every aspect of our lives, our society, and our world. Finally, in terms of 


importance and complexity, political decisions share many features with other important

and complex decisions in our lives. In other words, examining political decision making

is a way to also analyze decision making about important and complex questions more

generally (Lupia, McCubbins, & Popkin, 2000a).

Research on Decision Making

This study is motivated by an interest in how and how well schools prepare

children to be active participants in a democratic society and in what formal education

can do to improve the soundness and reasonableness of citizens’ political decisions. A

democratic society needs informed citizens to shape public policy (Fishkin & Laslett,

2003; Lupia, McCubbins, & Popkin, 2000b; Madison, Hamilton, & Jay, 1788). This

study is based on the author’s earlier examination of efforts to improve students’ ability

to think critically. It can be argued that thinking critically is another way of saying

making important decisions well (Paul, 1993; Siegel, 1997).

Based on the author’s research on critical thinking and decision making, there

appeared to be an interesting disconnect between the critical thinking and judgment and

decision making literatures on the one hand (e.g., Beyer, 1985; Ennis, 1991; Halpern,

1998; Kahneman et al., 1982; Kahneman & Tversky, 2000; Lipman, 1995; McCarthy,

1996; Siegel, 1997), and the social psychology and cognitive science literatures on the

other (e.g., Bargh & Chartrand, 1999; Chaiken & Trope, 1999; Epstein, 1990; Gilbert et

al., 1993; Higgins & Kruglanski, 1996; Zajonc, 1980). The literature on judgment and

decision making invariably treated decisions as products of conscious reasoning, without

investigating the possibility that decisions may not in all cases be the product of 

conscious reasoning (e.g., Baron, 2000; Ghirardato, 2001; Katzner, 1989; Kelsey, 1994;


Kelsey & Quiggin, 1992; Kravchuk, 1989; Nehring, 2000). It is simply assumed that

decisions are caused by and in all cases follow some amount of reasoning about

consciously-available information (Lupia et al., 2000a). Accordingly, traditional models

of decision making or choice (discussed in greater detail in Chapter II) posit that we first

reason about our alternatives and only then do we select the alternative (i.e., the decision)

that has the highest utility for us (Baron, 2000). Similarly, widely accepted theories of 

 political decision making assume that political decisions are the result of conscious

 processes alone (Green & Shapiro, 1994).

It is important to emphasize that the view that decision making is a purely

conscious process, a view that is dominant in political science and economics, is not the

only one. Recent work in social psychology, cognitive science, and decision research

includes a consideration of preconscious influences on choices or reasoning (Haidt, 2001;

Lupia et al., 2000b; Marcus et al., 2000; Stanovich & West, 2000; Slovic, Finucane,

Peters, & MacGregor, 2002). Specifically, theories and empirical findings from social

 psychology and cognitive science suggest that decisions and conscious reasoning about

the decision task might, at least initially, be the product of separate and sometimes

divergent processes (Damasio, 1994; Epstein, 1990; Zajonc, 1980). Dual-process theories

in social psychology (Chaiken & Trope, 1999), research on affect primacy (Zajonc,

1980), the automatic evaluation effect (Bargh, Chaiken, Raymond, & Hymes, 1996), and

moral judgment (Haidt, 2001), and findings from cognitive science (Calvin, 1996;

Damasio, 1994; Damasio, 1999; Dennett, 1991; Edelman & Tononi, 2000), for example,

challenge the entirely conscious model of decision making that dominates the literature

on critical thinking and political judgment and decision making. Nevertheless, a review


of the literature on judgment and decision making, particularly in the domains of political

science and economics, reveals little evidence that decision theorists or researchers were

aware of theories and findings outside of their disciplines suggesting that decision

making and reasoning might be separate processes and that preconscious processes might

influence decision making and reasoning.

At the same time, there was scant evidence in the literature that theorists and

researchers in social psychology and cognitive science appreciated the implications of 

their work for choice or decision research. It was as though findings from social

 psychology and cognitive science did not exist or were not relevant to the study of 

 political decision making or reasoning. Fortunately, in the last several years, there has

 been a growing awareness that these findings are highly significant to research on

 judgment and decision making (Haidt, 2001; Lupia et al., 2000b; Marcus et al., 2000;

Stanovich & West, 2000; Slovic et al., 2002). Still, there was no research that addressed

the question of whether preconscious processes influence decision making about complex

 policy questions and that explored this question by interviewing study participants about

their decision making in response to complex questions. The present study addressed this

gap in the decision literature by investigating whether preconscious processes influenced

decision making about complex questions of public policy.

Research on Preconscious Processes

This study was based on certain findings from social psychology and cognitive

science concerning preconscious processes and it extended those findings in this inquiry

of political decision making, an area that has only recently been influenced by these

findings (Lupia et al., 2000b; Marcus et al., 2000). In particular, this study is an extension


of the following findings to the investigation of decision making about complex policy

questions:

• People have little or no awareness of or access to their cognitive processes, so

they are not aware of the actual reasons for their decisions or actions even though

they can produce reasons when prompted to do so (Nisbett & Wilson, 1977).

• Our affective response to an object in the environment is automatic and it

 precedes our conscious response, suggesting that there may be separate affective

and conscious cognitive systems (Murphy & Zajonc, 1993; Zajonc, 1980).

• Based on their affective response, people automatically evaluate every attitude

object (i.e., word) they encounter, before consciously thinking about it (Bargh et

al., 1996; Gilbert et al., 1993).

• Moral judgment precedes moral reasoning, with post hoc reasoning providing

reasons for the initial moral judgment rather than causing the initial judgment

(Haidt, 2001).

• Decisions and actions are the products of two parallel and interactive information

 processing systems, a preconscious affective system and a conscious rational

system, with the preconscious system dominating most everyday decisions

(Epstein & Pacini, 1999).

• Decision making with respect to personal and social matters is not possible, or is at

least severely compromised, without the benefit of emotional signals, or somatic

markers, that narrow the range of possible decision options (Damasio, 1994).

• Most people do not provide sound evidence to support their causal theories about

social phenomena; nevertheless, they are as certain of the accuracy of their


theories as those who do. Further, people have great difficulty distinguishing their 

theories from evidence so that they are unlikely to evaluate the quality of 

evidence when assimilating it into their existing theories and are therefore likely

to have low epistemological sophistication (Kuhn, 1991).

• Affect may operate as a substitute or “heuristic” for reasoning about the decision

task in those cases where decision-specific information is not available or where

the decision topic is emotionally salient (e.g., Slovic et al., 2002).

Based on these theories and findings this study investigated whether existing decision

models were incomplete because they did not account for the influence of preconscious

 processes.

Although the theories and studies cited above and again in Chapter II suggest that

decisions may be made preconsciously or automatically, at least in some instances, the

issue of whether decision making is a preconscious or conscious process is open to

debate. This study does not seek to resolve the debate. Instead, this study seeks only to

explore the implications of the cited research for analyses of political decision making,

an area that has not been influenced in a meaningful way by the cited works. One

consequence of investigating preconscious influences on decision making is to

investigate how the self (i.e., the background and characteristics of each decision maker)

shapes decision making and reasoning. To date, most decision models and decision

research in political science have neglected the self system and how it may influence the

decision making process.


Investigating the Decision-Making Process

The present study investigated three research questions. The first concerns

whether preconscious processes influence decision making about complex policy

questions. The second concerns how decisions about different topics compare when one

topic is less familiar than the other. The third question examines how two groups of 

decision makers compare when making identical decisions. A principal objective of the

study was to investigate how policymakers make political decisions, so the sample

included state legislators. Choosing this sample had important consequences for the

design of the study and the data collected, since the questions and procedures suitable for

this population would differ from those appropriate for populations studied in the

past (e.g., college students). For example, when measuring legislators’ response times to

decision questions, it was not appropriate to ask them to press a button each time they

made a decision to mechanically record decision latency. These special circumstances are

further elaborated in Chapter III.

Preconscious Influences on Decision Making

If decision making and reasoning are separate processes, at least in the earliest

stages of decision making about complex questions, and if preconscious processes

influence decision making, evidence of this should be available in at least three forms.

First, this study examined the relation between how quickly participants made decisions

about complex questions and how much time they spent reasoning about the questions.

The traditional view that people think about the decision task and only then make a

decision requires that an individual reason first and then decide (Baron, 2000; Lupia et

al., 2000a). Evidence that making a decision takes less time than providing the reasons


for that decision is evidence that decision making and reasoning may occur separately.

After all, if the traditional model is correct, making a decision should take longer than

reasoning about the decision task, because the time it takes to make a decision must

include the time it takes to reason about the decision task.
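
The logic of this comparison can be shown with a small illustrative sketch; the latencies below are hypothetical values invented for the example, not data collected in this study.

```python
# Hypothetical latencies in seconds, invented for illustration only.
decision_latency = [4.2, 3.1, 5.0, 2.8, 6.3]     # time to state a decision
analysis_time = [38.5, 52.0, 41.7, 29.9, 60.2]   # time spent giving reasons for it

# Under a strict reason-then-decide model, stating a decision should take at
# least as long as reasoning about it, because the decision is assumed to
# follow that reasoning.
faster = sum(1 for d, a in zip(decision_latency, analysis_time) if d < a)
print(f"{faster} of {len(decision_latency)} hypothetical participants "
      "decided faster than they articulated their reasons")
```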

Second, this study investigated participants’ certainty in their decisions and the

information they reported about the decision topic. If a participant’s certainty about a

decision is not positively correlated with how much that person knows about the decision

topic, this suggests that certainty is not the product of participants’ conscious assessment

of the state of their knowledge about the decision topic. Instead certainty may be an

affective signal or feeling about whether one knows enough to make a decision (Haidt,

2001). The absence of a positive correlation could be interpreted as evidence that

affective signals have some bearing on the decision-making process, which is beyond

what traditional models contemplate (Epstein & Pacini, 1999; Haidt, 2001; Marcus et al.,

2000).
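
As a rough illustration of this reasoning (not an analysis reported in this study), one could compute a rank-order correlation between certainty and self-assessed knowledge; a coefficient near zero or below would be consistent with certainty operating as an affective signal rather than a conscious appraisal of knowledge. The ratings below are invented for the example and coded on simple 1-4 ordinal scales.

```python
# Requires SciPy. The ordinal ratings are hypothetical, invented for illustration.
from scipy.stats import spearmanr

certainty = [4, 4, 3, 4, 2, 4, 3, 4]   # 1 = "not certain" ... 4 = "certain"
knowledge = [1, 3, 2, 1, 2, 2, 4, 1]   # self-assessed knowledge, 1 = low ... 4 = high

rho, p_value = spearmanr(certainty, knowledge)
# A rho near zero (or negative) despite high certainty ratings would suggest
# that certainty does not track how much participants report knowing.
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")
```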

Also analyzed as part of this study were the nature and quality of participants’

evidence and reasoning about the decision questions, based on a content analysis of the

evidence they offered in support of their decisions, the sources of this evidence, and what

 participants said about their decisions and the reasons for their decisions. Specifically,

this study examined whether the reasons participants offered to explain their decisions

were relevant to the decision questions and were supported by reliable evidence. Also,

 participants’ ability to generate counterarguments to their own position on each policy

question was measured. Tallying the number of justifications and counterarguments

 participants provided made it possible to measure how much decision-specific


information each participant could report for each decision topic. At the same time, the

interview protocol included questions designed to evaluate the source and quality of 

 participants’ reasons for each decision. A finding that participants made complex policy

decisions with certainty and with little or no decision-specific information would

undermine the existing view that reasoning about consciously-available information

causes decisions (Kuhn, 1991; Lau & Redlawsk, 2001; Lupia et al., 2000a; Perkins,

Farady, & Bushey, 1991).

The review of literature on which this study is based revealed no prior decision

research that examined questions of response time, certainty, or evidence quality

concurrently in connection with complex policy decisions, as was done in this study. It

appears this gap in the literature exists for two related reasons. There is a pervasive

assumption in the domains of political science and economics that decisions are in all

instances the products of conscious processes, so there has been no reason to investigate

the possibility that preconscious operations influence decisions because it has been

assumed that decision making is an entirely conscious process. Also, decision researchers

appeared slow to respond to the findings from social psychology and cognitive science

discussed previously.

The data collected in this study were used to test the hypothesis that preconscious

 processes influence decision making and the assumption that conscious reasoning

 precedes decision making in all instances. Any material inaccuracy in existing decision

theories or models is significant, if the goal is to use formal education to improve

decision making and reasoning, because educational programs to improve decision

making must be based on accurate models of how people make choices, why people


make one choice and not another, and how and why the decision-making process differs

for different people and different decisions. In other words, decision models must reflect

empirical evidence about the decision-making process. As explained in Chapter II, the

dominant models of decision making may not be descriptive or accurate since they do not

reflect the latest evidence on how people actually make decisions about complex

questions (Evans & Over, 1996; Green & Shapiro, 1994; Kahneman & Tversky, 2000).

Comparing Decision-Making Processes for Two Topics

In this study, participants made one decision about a more familiar decision topic

and a second decision about a less familiar topic. Asking participants to make decisions

about two decision topics in this way made it possible to examine how information and

experience might bear upon participants’ decision making about complex policy

questions. It was also possible to investigate how the decision-making process varied

within and between individuals for different topics. Relying on Slovic et al.’s (2002)

work on the affect heuristic, there was reason to believe that participants would not make

decisions about an unfamiliar policy question based on decision-specific information

 because they were not likely to have such information. If Slovic et al. (2002) are correct,

 preconscious signals may substitute for consciously-available information when the

decision topic is less familiar. For the more familiar decision topic, however, participants

may make their decisions based on decision-specific and consciously-available

information. This finding would suggest that traditional decision models are inadequate

to describe how people make decisions about complex topics that are unfamiliar.

The phrase “less familiar” means that participants generally should not have

thought about the topic or discussed it often or explicitly with anyone previously, and if 


they had, that their exposure to the issue had been cursory. In the present study, the

decision question about replacing the school board in the respondent’s school district

with a private company, in effect privatizing public education without any indication that

the school district was underperforming or failing, was designed to be novel or less

familiar. None of the well-functioning suburban school districts in either of the two states

whose legislators were interviewed (all the legislators were from suburban districts) had

replaced their boards with a private company; and, to my knowledge, this proposal had

not been raised in the legislators’ districts. Therefore, even if participants had previously

thought about or discussed the question of privatizing public schools, it was anticipated that

they would not have been exposed to the specific question of replacing the school board

in their legislative district.

By comparison, the question about whether or not to limit class size to 25

students in all public schools was intended to be more familiar to study participants in

that participants should have heard about this issue before and should have had some

information about the advantages and disadvantages of class size limits. This second

question was likely to be more familiar given that all study participants would have

attended primary and secondary schools and would have had some personal experience

with the class size issue.

Comparing Decision-Making Processes of Two Groups

Comparing the decisions and interview responses for two groups could also

 provide evidence of how information and experience shape the decision-making process

about complex questions, by focusing attention on the ways in which the decision-

making process varied between groups. In particular, including two groups of 


 participants made it possible to examine how the evidence that legislators and doctoral

students offered in support of their decisions compared, whether one group had more

decision-specific information available for one or both topics, and whether one group

was more certain or quicker to make decisions about complex questions.

legislators and doctoral students in this study could reveal patterns and differences in

decision making that could relate to education level and professional experience, among

other things, and how these characteristics shape the decision-making process about

complex questions.

For example, based on Kuhn’s (1991) findings about graduate students in

 philosophy, it was hypothesized that the doctoral students in the sample would be more

circumspect and less certain in their decisions and interview responses than the

legislators, and more likely to acknowledge the limits of their knowledge about the

issues. At the same time, given that the doctoral students were in the process of studying

education through coursework and research, there was reason to believe that these

students would have more decision-specific information about the decision questions.

The two groups would potentially mention different types of evidence in explaining their 

decisions. For example, when they made a decision about whether or not to limit class

sizes to 25 students, legislators might point to the political consequences of raising taxes

to limit class size while doctoral students might focus on the empirical evidence about

the benefits of smaller classes.


Statement of the Problem

A considerable body of research suggests that decision making is influenced by

 preconscious processes, and that a decision may in some cases precede explicit reasoning

about the decision question, the consciously-available alternatives from which the

decision could be made, and the consequences of each alternative (Bargh et al., 1996;

Bargh & Chartrand, 1999; Denes-Raj & Epstein, 1994; Evans, 1996; Nisbett & Wilson,

1977; Zajonc, 1980). Thus, in some cases, conscious thought may serve only to generate

reasons that make sense of or justify a decision already made preconsciously. Also, these

consciously-available reasons may not be the ones that actually led to the decision

(Nisbett & Wilson, 1977).

Together, these findings have significance for the study and understanding of 

decision making about complex questions, including questions of public policy, but they

have only recently received attention from decision making and political theorists. To my

knowledge, no study has addressed the specific question of whether political decision

making is influenced by preconscious processes that precede and interact with explicit

reasoning. Further, there appeared to be no research in any domain that has addressed the

question of preconscious decision making about everyday, complex policy questions in

an interview study. This gap in the decision-making literature may be the result of an

implicit assumption across disciplines that decisions about complex questions are in all

instances the product of conscious processes (e.g., Lupia et al., 2000a). Thus, the present

study was designed to test whether there was evidence that preconscious processes

influenced political decision making and reasoning and to examine the related questions

of how knowledge or experience with regard to a political question influenced decision


making and reasoning and how the decision making and reasoning of legislators and

doctoral students compared.

Purpose

The purpose of this study was to test the hypothesis that decisions about complex

questions may be influenced by preconscious processes. Therefore, in some cases,

reasoning may serve to justify or explain a decision already made. Testing this

hypothesis in an interview study could reveal that preconscious processes shape decisions

about complex questions, which is contrary to the widely-accepted assumption in the

economics and political science literature that decisions are the products of conscious

reasoning alone. This finding would have important implications for the design of 

educational programs to improve reasoning and decision making.

Data Sources

To address this global purpose, this study relied primarily on interview data from

state legislators and doctoral students in a college of education. Content analyses of these

interviews were conducted and categorical variables pertaining to sources and quality of 

evidence, nature of counterarguments, participants’ certainty in the accuracy of their 

decisions, participants’ self-assessed knowledge about the decision topics, partisan

characteristics of decision topics, and novelty of decision topics, among others, were

identified. Along with these categorical variables, latency data were collected to cross-

validate trends in the interview data and to address the possibility that decisions did not

in all instances follow conscious reasoning about decision questions.
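
As a purely illustrative sketch of the kind of record such coding implies (the field names and values below are hypothetical, chosen only to show how categorical codes and latency measures could sit side by side for one decision):

```python
# Hypothetical coded record for one participant and one decision question;
# field names and values are illustrative only, not the study's actual coding scheme.
coded_response = {
    "group": "legislator",             # or "doctoral_student"
    "decision_topic": "class_size",    # the more familiar topic in this example
    "decision": "support",
    "evidence_source": "personal",     # e.g., external, personal, or nonevidence
    "counterarguments": 1,             # number generated when prompted
    "certainty": "certain",            # four-point scale from the interview protocol
    "self_assessed_knowledge": "a little",
    "affective_response": True,
    "decision_latency_s": 3.4,         # seconds taken to state the decision
    "analysis_time_s": 47.0,           # seconds spent offering reasons
}
print(coded_response["decision"], coded_response["decision_latency_s"])
```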


Definitions

 Affect refers to a preconscious signal that could influence conscious reasoning and

of which the decision maker is aware. Affect is one of various preconscious signals and it

is distinguished by the fact that the decision maker is aware of his or her affective

response to a decision alternative, though not necessarily aware of the reason for that

response. The terms affect, emotions and feelings are used interchangeably herein.

Certainty refers to how sure participants said they were about the policy decisions

they made as part of this study. Question 3 in the interview protocol in Appendix A asked

 participants to rate how sure they were that their policy decision was correct, on a scale

consisting of four choices: “not certain,” “somewhat uncertain,” “somewhat certain” and

“certain.”

Complex decisions or questions are those that require the selection of one

alternative from a set of two or more alternatives whose outcomes or consequences are

uncertain because they involve the interaction of many causes, effects, actors and other 

variables over time, for which there are no certain optimal answers, and for which it is

not possible to consider the outcome of all possible decision alternatives in finite time

 because of the combinatorial explosion of alternatives or outcomes and the computational

complexity of trying to reach an optimal result (Cherniak, 1986).
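
A rough numerical illustration of the combinatorial explosion referred to here (the numbers are illustrative, not drawn from Cherniak, 1986): with even a modest number of interacting yes/no factors, the space of joint outcomes quickly exceeds what any decision maker could evaluate exhaustively.

```python
# Illustrative only: k interacting yes/no factors generate 2**k joint outcome
# combinations, before probabilities or utilities are even assigned.
for k in (10, 20, 30):
    print(f"{k} binary factors -> {2**k:,} outcome combinations")
```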

Conscious processes are composed of “mental acts of which we are aware, that

we intend (i.e., that we can start by an act of will), that require effort, and that we can

control (i.e., we can stop them and go on to something else if we choose)” (Bargh &

Chartrand, 1999, p. 463). Reasoning is conscious. The relevant distinction between what

is conscious and what is preconscious is that conscious refers to intentional and effortful


 processes of which we are aware, while preconscious processes are not intentional or 

effortful, and we may not be aware of their operation.

Consciously-available refers to information or reasons we can recall from long-

term memory and report to explain and support a decision.

 Decision refers to the selection of one alternative from a set of two or more

alternatives. A decision can be the product of preconscious or conscious processes, or 

 both.

 Decision making and decision-making process refer to both the preconscious

 processes and the conscious processes that become active when one is faced with a

decision task that may lead to a decision. However, it should be clear that for the

 purposes of this study the term decision making does not necessarily refer to a conscious

 process.

 Decision-specific information refers to consciously-available information that is

directly relevant to a specific decision task. For instance, in deciding whether to limit

class size to 25 students in all public schools in the state of Florida, the information in a

study on the effects of class size reductions in Kentucky is decision-specific. Whether 

information is decision-specific is a matter of degree and it depends in part on the nature

of the decision task. If the decision task is general, more information may be directly

relevant or  specific to it.

 Emotions, for the purposes of this study, are preconscious processes of which we

are aware, because they are accompanied by a feeling or other signal we can report.

Emotions, for present purposes, are a subset of preconscious processes.


Intuition and intuitive refer to the holistic preconscious assessment or decision

that may be reached in response to a decision task. In recent work on affect and decision

making (e.g., Gilovich, Griffin, & Kahneman, 2002), and in Haidt’s (2001) social

intuition model of moral judgment, the term intuition is used to describe mental

 processes that influence judgment and reasoning but that are not conscious and are not

reasoning. The term intuition was adopted here because its use is being established in

relevant literature and because its meaning is accessible to lay readers. However, in the

 present study the term intuitive decision was used instead of intuition alone or intuitive

 judgment to make clear that the focus of this study is on decision making and to contrast

the intuitive decision with the reasoned decision that results from conscious reasoning.

 Preconscious process, also referred to herein as a preconscious influence, is

defined as any mental operation or process that takes place “not only effortlessly, but

without any intention or often awareness that it was taking place” (Bargh & Chartrand,

1999, p. 464). Preconscious processes also include those “intentional, goal-directed

 processes that became more efficient over time and practice until they could operate

without conscious guidance” (Bargh & Chartrand, p. 463), those processes that could be

also be described as “schema” or “procedures.” Preconscious processes are “not the

 product of deliberate processing, but of quicker, more reflexive processes that are less

available to conscious intervention” (Gilovich & Griffin, 2002, p. 16). The relevant

distinction between what is conscious and what is preconscious is that conscious refers to

intentional and effortful processes of which we are aware, while preconscious processes

are not intentional or effortful, and we may not be aware of their operation.


Research Questions and Hypotheses

1. Do the decisions of state legislators and doctoral students in a college of 

education about two educational policy issues, and their responses to interview

questions about their reasoning on those issues, provide evidence of the influence

of preconscious processes on decision-making and reasoning about policy issues?

Preconscious processes would be indicated by: (a) the amount of time it takes

 participants to make a decision compared to the amount of time it takes to provide

reasons in support of the decision; (b) the sources and quality of evidence they

offer in support of their decision; (c) participants’ certainty in their decisions

relative to the amount and quality of the information they report about the

decision topic; (d) participants’ report of an affective response to the decision

topic; (e) participants’ choice of a purely conscious or an intuitive decision model

to illustrate the decision making and reasoning process; and, (f) how quickly

 participants report having made their policy decision.

2. Do the decision-making and reasoning processes of state legislators and doctoral

students differ for more familiar and less familiar policy issues?

3. Do state legislators and doctoral students in a college of education decide and

reason differently about educational policy issues?

Based on the literature reviewed in Chapter II, it was hypothesized that all three

research questions could be answered in the affirmative. As for more specific hypotheses,

it was predicted that:

• Participants would make decisions more quickly than they would generate

reasons to explain their decisions.


• The decision question concerning privatization of public schools would be less

familiar than the question about limiting class size.

• How much participants knew about each decision topic would influence how they

made decisions.

• Participants’ certainty in the correctness of their policy decisions would be based

on an affective signal in some cases rather than on a conscious evaluation of their 

state of knowledge on the policy question.

• Legislators would be more certain about their decisions than graduate students.

• The nature and quality of participants’ evidence would support the conclusion

that their policy decisions were not in all instances based on reasoning about their 

decision-specific information.

• Graduate students would make decisions more slowly than legislators.

• Graduate students would offer more justifications or decision-specific

information in support of their decisions than would legislators.


CHAPTER II

REVIEW OF RELEVANT LITERATURE

This chapter sets forth the theoretical and empirical support for the hypothesis

that decisions about complex questions may be influenced by preconscious processes,

and therefore in some cases reasoning may serve to justify or explain a decision already

made as a result of preconscious processes. This hypothesis is the basis for the first

research question, which investigates whether people exercise less conscious control over 

complex policy decisions than is assumed in the judgment and decision making and

 political science literatures, since such decisions may be more like the automatic

responses to attitude objects and evaluations of other stimuli found in studies of affect

 primacy (Zajonc, 1980) and the automatic evaluation effect (e.g., Bargh et al., 1996) than

imagined in the decision-making literature on utility maximization and reasoned analysis.

This chapter also includes a discussion of knowledge and experience as they relate to the

second and third research questions, which concern whether and how (a) participants’

decision making and reasoning differ for each of the two decision tasks and (b) decision

making about the decision tasks differs between the two sample groups. Although it may

seem obvious that decisions differ and people differ, this point is often neglected in

studies of decision making and reasoning.

The literature reviewed in connection with these three research questions is

organized into three sections in this chapter. The first section on decision making and

 preconscious processes includes literature that pertains to all three research questions.

The second section on political decision making reviews theories and empirical findings

about political decision making that relate to the first research question and are consistent


with the hypothesis that preconscious processes influence decision making about

complex policy questions. A third section on knowledge, experience, and expertise

introduces literature concerning experience and knowledge to be considered in

connection with the second and third research questions.

Literature Selection Criteria

This review of literature is intended to be comprehensive in its coverage of 

theories and findings that suggest that preconscious processes influence decision making

or conscious reasoning about complex questions. The sections on decision making and

 preconscious processes and political decision making include a discussion of, or at least a

citation to, every study or theory found while reviewing literature that was directly

relevant to the first research question about the influence of preconscious processes on

decision making and reasoning about policy questions. That it was possible to review

every study that related directly to the question of whether decision making about

complex questions is subject to preconscious influence reveals how little theoretical and

empirical work addresses the interaction of preconscious and conscious processes in

complex decision making or reasoning tasks. By contrast, the review of literature for the

second and third research questions in the section on “Knowledge, experience, and

expertise” only surveys relevant literature on expertise or knowledge since these bodies

of literature are too large to review exhaustively here.

Decision making is a subject that is within the purview of a wide range of 

intellectual disciplines or areas of study, including judgment and decision making,

several branches of psychology, political science, education, cognitive science,

economics, law, philosophy, and business and management. Given the number and

diversity of articles, chapters, and books on decision making in all its guises, it is

surprising that so few of these sources consider or investigate the possibility that

 preconscious processes might shape thinking about complex questions. All of the sources

reviewed that considered or investigated this possibility are included in the sections

entitled “Decision making and preconscious processes” and “Political decision making.”

The remaining and much larger body of decision research, which makes no reference to

 preconscious processes, is summarized in the first of these sections under the heading

“Traditional model of reasoning and decision making.” This summary describes the

dominant decision-making model and its most significant shortcomings.

With regard to the specific selection criteria for publications discussed in

connection with the first research question, literature was included if it met one or more

of the following criteria: (a) its hypotheses were similar to the central hypothesis of this

study that preconscious processes may influence decision making about complex

questions (Evans, 1996; Haidt, 2001; Peters & Slovic, 2000); (b) it investigated decision

making about complex real world problems (Kuhn, 1991); (c) it proposed that political

decision making was the result of affective or preconscious processes (e.g., Marcus et al.,

2000) or that political decisions were not in all instances the result of reasoning about

consciously-available information (e.g., Geva, Mayhar, & Skorick, 2000; Lodge, 1995);

(d) it provided empirical support for the central hypothesis that decision making could be

a preconscious or intuitive process (e.g., Bargh et al., 1996; Bargh & Chartrand, 1999;

Zajonc, 1980); (e) it provided evidence that the reasons we offer for our decisions and

actions are not necessarily the ones that caused them (Nisbett & Wilson, 1977); or (f) it

 provided theoretical explanations for why the decision-making process might begin with

a decision that is followed by reasoning (e.g., Damasio, 1994; Epstein & Pacini, 1999). If 

literature did not meet one or more of these criteria, it is not discussed in detail, even if 

the work figures prominently in the study of judgment and decision making in other 

studies or disciplines.

Many relevant theories and studies of decision making were not included in this

chapter because they were not essential to the narrow focus of the present study,

necessary to sustain the viability of the central hypothesis, or relevant to investigating and

answering the research questions. The most prominent exclusion is the literature on

cognitive heuristics and biases (Kahneman et al., 1982). Until very recently, heuristics

and biases research did not contemplate preconscious decision making, and, except as

described in the section on affect as a substitute for consciously-available information

(e.g., Slovic et al., 2002), it does not concern complex, ill-structured tasks or real-world

 political or social problems, so it was excluded.

Additionally, studies and theories concerning attitude formation, conceptual change, and persuasion were excluded because all of these, including cognitive

dissonance, attribution and balance theories, concern the processing of and the influence

of new information, most often when participants already have a position on an issue, an

impression of a person, or an attitude towards an object. For instance, “persuasion is the

 process of stimulating change in the way an individual understands or views a particular 

issue or topic by fostering a deeper processing or reflection of that issue or topic” (Buehl,

Alexander, Murphy, & Sperl, 2001, p. 270). Further, “[a]t their core, the literatures in

 persuasion and conceptual change focus on the change process and rely heavily on well-

crafted messages to stimulate such change” (Buehl et al., 2001, p. 270). Since this study

does not provide participants with any information and does not examine how

 participants change their positions in response to new information, how participants’

decisions subsequently shape their openness to new information on the decision topic at

some later time, whether participants are willing to change an initial decision as new

evidence is received over time, or to what extent participants protect existing ego

commitments and beliefs, these studies and theories are not covered herein.

Decision Making and Preconscious Processes

As mentioned earlier in this chapter, this section includes literature that crosses all

three research questions. The emphasis in this section is on work suggesting that

 preconscious processes may interact with conscious processes. To contrast such research

with the much larger body of literature based on the reasoning-first-and-then-decision-

making conception of an entirely conscious decision-making process, the first subsection

summarizes the common features of what is denoted herein as the “traditional” model of 

decision making, which the present study challenges. Following the subsection on the

traditional model is a subsection on “Preconscious processes,” which reviews those

theories and findings that suggest that cognitive processes that are assumed to be entirely

conscious might be subject to preconscious influence. Following this is a subsection on

“Reason-based analyses of choice and why reasons are so important,” which reviews

 briefly work on the role of reasons in decision making and then outlines Kuhn’s (1991)

study of informal reasoning. Kuhn’s study provided the methodological framework for 

this study, including a template for the interview protocol, certain variables and coding

schemes, and the content analysis of interview data.

Traditional Model of Reasoning and Decision Making

The question of “how much conscious control we have over our judgments,

decisions, and behavior is one of the most basic and important questions of human

existence” (Bargh & Chartrand, 1999, p. 463). It is this question that shapes the review of 

relevant literature for the first research question, which asks whether there is evidence

that preconscious processes influence legislators’ and doctoral students’ decisions about

complex questions. The same question stated differently is whether the most widely

accepted models of reasoning and decision making are accurate and complete. These

models of reasoning and decision making go by many names in many disciplines, but for 

our purposes they are identical because they share one important feature, or defect.

Whether a theory of decision making or choice is labeled as formal, normative, expected

utility, utility maximization, rational choice, public choice, social choice, or cost-benefit,

and whether the field is economics, psychology, politics, or artificial intelligence, for the

 past 50 years prominent models of decision making have assumed that decisions are the

 product of reasoning alone, with no reference to preconscious processes. The central

hypothesis of the present study challenges this assumption and the first research question

tests it. Decision models that make no reference to preconscious processes are referred to

collectively in this study as the “traditional” model of reasoning and decision making.

Accordingly, the traditional model includes, but is not limited to, normative theories of 

choice, rational choice theories and other utility maximization theories. These labels are

used interchangeably herein.

This study is built on the view that reasoning alone is not and cannot be the

source of all decisions about complex questions (Cherniak, 1986). The hypothesis is that,

in some cases, decisions may precede reasoning and, therefore, reasoning may have little

influence on a decision already made based on the operation of preconscious mental

 processes. The traditional model puts reasoning first in time while this study examines

the possibility that decisions might sometimes come first. This focus on what comes first raises the question: Why does it matter whether the decision or reasoning about the

decision task occurs first? It matters primarily because, as Bargh and Chartrand (1999)

note in the quotation that begins this section, understanding how much control we

exercise over our thoughts, decisions and actions is essential to understanding: ourselves;

how and why we think, decide, and behave as we do; and whether we have reason to be

satisfied with how and why we think, decide, and behave. Also, as educators and

researchers, we must understand the decision-making process if we seek to

teach people to make important decisions well.

Figure 1 offers a basic depiction of the traditional model of reasoning and

decision making. There are three major elements in this model: the decision task, the

conscious reasoning process, and the reasoned decision. The decision task is the decision

to be made. The conscious reasoning process consists of all conscious mental operations

that produce the reasoned decision, which is the decision made in response to the

decision task. Such operations could include information retrieval from long-term

memory, research to find additional relevant information, consideration of information

received from other people, and an analysis of the costs and benefits or the expected

utility of various decision alternatives. The dotted arrow from the reasoned decision back 

to the conscious reasoning process depicts how, in some cases, an initial reasoned decision may be revisited and revised through further conscious reasoning.

Figure 1

Traditional Model of Reasoning and Decision Making

Figure 2

Intuitive Model of Decision Making and Reasoning

Based on findings from the heuristics and biases research paradigm (e.g., Gilovich et al., 2002; Kahneman et al., 1982;

Kahneman & Tversky, 2000), as well as other criticisms of normative theories of choice

or decision making (e.g., Evans & Over, 1996; Green & Shapiro, 1994), it is well

established in the literature on judgment and decision making that expected utility,

rational choice, and other normative models of decision making or choice are not

descriptive, in that they do not reflect how people actually make decisions (Ajzen, 1996;

Kahneman & Tversky, 1979). As Zajonc (1980, p. 172) observes, “People do not get

married or divorced, commit murder or suicide, or lay down their lives for freedom upon

a detailed cognitive analysis of the pros and cons of their actions.” The traditional models

are not prescriptive or normative either, although they are often described in the

 judgment and decision making literature as such, because these models are not resource-

realistic: no actual person has the cognitive resources necessary to complete, in a finite

amount of time, the mental operations required of rational decision makers under such

models (Cherniak, 1986). Even in the face of its apparent shortcomings (Ajzen, 1996;

Evans & Over, 1996; Slovic, 1991), the traditional model (which includes rational choice

and other utility maximization models) survives as the most widely accepted and applied

way of representing the decision-making process. This section reviews briefly some of 

the weaknesses of the traditional model as it relates to political decision making.

First, it is important to note that while judgment and decision making theorists

may consider irrational (a) the incomplete consideration of alternatives when making

 political or other important decisions, (b) the premature willingness to accept a plausible,

 but not necessarily sound, explanation of events, (c) certainty in one’s beliefs, or (d) the

unwillingness to revise existing beliefs in the face of strong evidence, there is no

evidence in the literature that such tendencies are maladaptive (Stanovich & West, 2000;

Todd & Gigerenzer, 2000). In other words, it may be to our advantage that we do not

conform to normative models of choice, which require examination and comparison of 

all possible decision alternatives to maximize utility. After all, defending one’s thoughts,

 beliefs, values, and attachments, being very confident about one’s abilities, and spending

more time doing and less time thinking are all probably adaptive behaviors, for they

allow us to accomplish tasks and overcome challenges that might be abandoned if given

too much thought.

People will often do whatever they can to maintain their belief systems, which are

the maps by which they navigate the world. Without a model of what the self and

the world are like, of what is true and not true, and of what is right and wrong, a

 person’s life would collapse into chaos and overwhelming anxiety. . . . To operate

effectively, you need to believe that the world is manageable, predictable, and

controllable, at least within certain practical limits. (Epstein, 1998, p. 85)

In light of this, too much thinking may do more harm than good (Todd & Gigerenzer,

2000).

Widely accepted models of political decision making, which include rational

choice, rational actor, public choice, social choice, and game theories, as well as positive

 political economy and economic approaches to politics (Green & Shapiro, 1994, p. xi),

 posit that decision makers ought to and do try to maximize subjective expected utility

when making choices. Stated broadly, expected utility models propose that the normative

way to make a decision is to (a) consider all your alternatives in connection with the

decision and all the consequences of each alternative, (b) rate the value or utility/disutility

of each of these alternatives for you (their subjective utility), (c) multiply the subjective

utility of each alternative by the probability (or expectation) that the alternative will be

realized, and (d) select the alternative or choice with the highest subjective expected

utility (Baron, 2000). According to this model, legislators should decide whether to

support or oppose a class size limit of 25 students in all public schools by comparing all

the consequences of supporting and opposing this proposal, rating the utility of supporting

the proposal and of opposing the proposal, multiplying the utility of each by the

 probability it will happen, and then choosing to support or oppose the legislation based on

which alternative has a higher number of utility units. The data indicate that this is not

how legislators or doctoral students make such decisions.
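
To make the arithmetic behind steps (a) through (d) concrete, the subjective expected utility comparison can be written out as follows. The notation and the illustrative numbers are introduced here only for exposition; they are not drawn from the cited sources or from the study’s data.

    % Subjective expected utility of an alternative a with possible outcomes x_i(a)
    % occurring with probabilities p_i(a); the decision rule is to pick the maximum.
    \[
    \mathrm{SEU}(a) \;=\; \sum_{i} p_{i}(a)\, u\!\bigl(x_{i}(a)\bigr),
    \qquad
    a^{*} \;=\; \arg\max_{a} \mathrm{SEU}(a)
    \]
    % Hypothetical illustration for the class-size proposal:
    %   SEU(support) = (0.6)(+10) + (0.4)(-5) = +4
    %   SEU(oppose)  = (0.5)(+2)  + (0.5)(-1) = +0.5
    % so the traditional model would prescribe supporting the limit.

Nothing more than this weighted sum is involved in the formal model; the question raised throughout this chapter is whether decision makers actually perform anything like it.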

According to normative theories of choice, “several qualitative principles, or 

axioms, should govern the preferences of the rational decision maker” (Kahneman &

Tversky, 1984, p. 4). In particular, all formal analyses of choice incorporate two such

 principles: “dominance and invariance. Dominance demands that if prospect A is at least

as good as prospect B in every respect and better than B in at least one respect, then A

should be preferred to B. Invariance requires that the preference order between prospects

should not depend on the manner in which they are described” (Kahneman & Tversky,

1984, p. 4). However, in their research, Kahneman and Tversky have shown repeatedly

that people make different choices in response to formally equivalent but apparently

different versions of the same choice problem, which means that the invariance axiom

does not hold (Tversky & Kahneman, 1992). Without invariance, normative decision

theory is untenable. This phenomenon of preference reversals is an insurmountable threat

to utility maximization theories of choice, since “[i]t suggests that no optimization

 principles of any sort lie behind even the simplest of human choices” (Grether & Plott,

1979, p. 623). Further undermining formal theories of choice, studies show that “decision

making is a highly contingent form of information processing, sensitive to task 

complexity, time pressure, response mode, framing, reference points, and numerous other 

contextual factors” (Slovic, 1991, p. 500). “The normative assumption that individuals

 should maximize some quantity may be wrong. Perhaps . . . there exists nothing to be

maximized” (Slovic, 1991, p. 500).
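
For reference, the dominance and invariance requirements quoted above can be stated compactly; the notation below is introduced here for exposition and is not Kahneman and Tversky’s.

    % Dominance: if prospect A is at least as good as B in every respect i and strictly
    % better in at least one respect j, then A must be preferred to B.
    \[
    \bigl(\forall i:\; a_{i} \succeq b_{i}\bigr) \wedge \bigl(\exists j:\; a_{j} \succ b_{j}\bigr)
    \;\Rightarrow\; A \succ B
    \]
    % Invariance: preference may not depend on how the prospects are described. For any
    % two formally equivalent descriptions d and d' of the same pair of prospects,
    \[
    A \succ_{d} B \;\Longleftrightarrow\; A \succ_{d'} B
    \]

The framing results just described are, in these terms, violations of the second requirement.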

Unfortunately, notwithstanding the limitations on utility maximization theories in

describing and analyzing how people make political decisions, political scientists continue

to use them to study political decisions and actions at the individual, group, and national

levels (for a review of their application in political science, see Friedman, 1996; Green &

Shapiro, 1994). In reviewing rational choice scholarship in political science, Green and

Shapiro (1994) concluded that “the case has yet to be made that [rational choice] models

have advanced our understanding of how politics works in the real world” (p. 6), “rational

choice theory has yet to deliver on its promise to advance the empirical study of politics”

(p. 7), and “to date few theoretical insights derived from rational choice theory have been

subjected to serious empirical scrutiny and survived” (p. 9). Based on the foregoing,

utility theories of choice have limited utility for the study of decision making about policy

and other complex questions.

Empirical analyses of the traditional model using logic games and other well-

structured tasks have shown that it is flawed. Based on heuristics and biases research,

criticism of formal decision models is widespread and well supported. However, there has

 been little effort in the literature on judgment and decision making, political science or 

economics to investigate how adults make complex decisions, and whether there is any

evidence of preconscious influences on such decisions. This study addressed this gap in

the literature by using interviews to examine how policymakers and doctoral students make policy decisions, in an effort to challenge the traditional model’s utility in research

on political decision making.

Preconscious Processes

There is considerable evidence that the brain does quite a bit preconsciously or 

automatically, including self-regulation, sensory perception and affective evaluation of 

environmental stimuli (see Bargh & Chartrand, 1999, for a review of automatic

 processes). What is most significant for purposes of this study, however, is not that self-

regulation functions and perception operate outside of conscious awareness, but that

decision making might–that processes assumed to be entirely conscious may be the result

of preconscious processes (Bargh & Chartrand, 1999; Damasio, 1994; Epstein & Pacini,

1999; Evans, 1996; Haidt, 2001; Loewenstein, Weber, Hsee, & Welch, 2001; McGraw &

Steenbergen, 1995; Nisbett & Wilson, 1977; Sears, 1993; Zajonc, 1980). The evidence

from social psychology and neuroscience has important implications for the study of 

decision making and reasoning, and it supports the hypothesis that decisions about

complex questions may be influenced by preconscious processes. This section

concentrates on lines of research that detail the nature of those preconscious processes that

may influence the decision-making process.

It must be noted at the outset that the literature cited in this chapter deals with

complex and difficult questions about how the mind works and there is significant

disagreement about some of the literature reviewed herein. The literature in this chapter is

not presented as the only or as the definitive work on the processes or questions discussed.

Instead, this literature is presented for what it might reveal about the decision-making

 process, since what the literature reveals is not part of the most widely accepted and

applied models of decision making. The literature provides insights and raises issues that

must be addressed if we are to understand how and why we think, decide and act as we do,

though certain of the lines of research discussed are the subject of continuing controversy

and disagreement.

The Absence of Introspective Awareness and a Reliance on a priori Causal Theories

 Nisbett and Wilson (1977) sought to find empirical support for the view that

“people have no direct access to higher order mental processes” (p. 232). Following their 

review of data from existing studies on cognitive dissonance and attribution processes, for 

example, and their own research to find empirical support, Nisbett and Wilson (1977, p.

233) reached three major conclusions:

1. People often cannot report accurately on the effects of particular stimuli on

higher order, inference-based responses. Indeed, sometimes they cannot

report on the existence of critical stimuli, sometimes cannot report on the

existence of their responses, and sometimes cannot even report that an

inferential process of any kind has occurred. The accuracy of subjective

reports is so poor as to suggest that any introspective access that may exist

is not sufficient to produce generally correct or reliable reports.

2. When reporting on the effects of stimuli, people may not interrogate a

memory of the cognitive processes that operated on the stimuli; instead,

they may base their reports on implicit a priori theories about the causal

connection between stimulus and response. If the stimulus psychologically

implies the response in some way or seems “representative” of the sorts of 

stimuli that influence the response in question, the stimulus is reported to

have influenced the response. If the stimulus does not seem to be a

 plausible cause of the response, it is reported to be noninfluential.

3. Subjective reports about higher mental processes are sometimes correct,

 but even the instances of correct report are not due to direct introspective

awareness. Instead, they are due to the incidentally correct employment of 

a priori causal theories.

This section reviews briefly their bases for these conclusions.

In one of the more than 20 studies Nisbett and Wilson (1977) reviewed, Goethals

and Reckman (1973) asked high school students for their opinions on 30 social issues,

including their attitudes towards busing for racial integration. A week or two after the

survey, students were asked to participate in a group discussion about the busing issue.

Each group had three students who were all pro-busing or anti-busing based on their 

survey responses, and one student confederate of the investigators who had been “armed

with a number of persuasive opinions and whose job it was to argue persistently against

the opinion held by all other group members” (Nisbett & Wilson, 1977, p. 236). Students

who were originally against busing “had their opinions sharply moderated in the pro-

direction. Most of the pro-busing subjects were actually converted to an anti-busing

 position” (Nisbett & Wilson, 1977, p. 236). Investigators then asked students to recall

what their original opinions on the busing issue had been, after reminding the students that

the researchers had the students’ original survey responses and would check for accuracy

of recall. Control subjects recalled their original opinions accurately. Experimental

subjects, by contrast, did not seem to be aware that their opinions had changed as a result

of the discussion.

[T]he original anti-busing subjects “recalled” their opinions as having been much

more pro-busing than they actually were, while the original pro-busing subjects

actually recalled their opinions as having been, on the average, anti-busing! In fact,

the original pro-busing subjects recalled that they had been more anti-busing than

the original anti-busing subjects recalled that they had been. (Nisbett & Wilson,

1977, p. 236)

It appeared that these students “did not actually experience these enormous shifts as

opinion change” (Nisbett & Wilson, 1977, p. 236). “No subject reported that the

discussion had any effect in changing or modifying his position” (Goethals and Reckman,

1973, p. 499).

 Nisbett and Wilson (1977) also describe a study in which Maier (1931) examined

how aware subjects are of their problem-solving processes. Maier asked subjects to tie

together two cords attached to the ceiling of a laboratory that was “strewn with many

objects such as poles, ringstands, clamps, pliers, and extension cords” (Nisbett & Wilson,

1977, p. 240). The two cords were anchored too far apart for subjects to hold on to one

while taking hold of the other. Some solutions, like tying the extension cord to one cord

and then pulling it towards the other, came readily. When one solution was achieved,

Maier asked subjects to try to solve the problem a different way. What happened next and

its implications are worth describing in detail and verbatim:

One of the solutions was much more difficult than the others, and most subjects

could not discover it on their own. After the subject had been stumped for several

minutes, Maier, who had been wandering around the room, casually put one of the

cords in motion. Then, typically within 45 seconds of this cue, the subject picked

up a weight, tied it to the end of one of the cords, set it to swinging like a

 pendulum, ran to the other cord, grabbed it, and waited for the first cord to swing

close enough that it could be seized. Immediately thereafter, Maier asked the

subject to tell about his experience of getting the idea of a pendulum. This question

elicited such answers as “It just dawned on me.” “It was the only thing left.” “I just

realized the cord would swing if I fastened a weight to it.” A psychology professor 

was more inventive: “Having exhausted everything else, the next thing was to

swing it. I thought of the situation of swinging across a river. I had imagery of 

monkeys swinging from trees. This imagery appeared simultaneously with the

solution. The idea appeared complete.”

Persistent probing after the free report succeeded in eliciting reports of 

Maier’s hint and its utilization in the solution of the problem in slightly less than a

third of the subjects. This fact should be quickly qualified, however, by another of 

Maier’s findings. Maier was able to establish that one particular cue–twirling a

weight on a cord–was a useless hint, that is, subjects were not aided in solving the

 problem by exposure to this cue. For some of the subjects, this useless cue was

 presented prior to the genuinely helpful cue. All of these subjects reported that the

useless cue had been helpful and denied that the critical cue had played any role in

their solution. These inaccurate reports cast doubt on any presumption that even

the third of Maier’s subjects who accurately reported that they had used the helpful

cue were reporting such use on the basis of genuinely insightful introspection,

since when they were offered a false “decoy” cue they preferred it as an explanation

for their solution. (Nisbett & Wilson, 1977, p. 241)

In another line of research recounted by Nisbett and Wilson (1977, p. 241), Latané

and Darley (1970) have shown “in a large number of experiments in a wide variety of 

settings that people are less likely to help others in distress as the number of witnesses or 

 bystanders increases.” Yet, Latané and Darley found that subjects seemed “utterly

unaware” of the influence the presence of other people had on their behavior.

Accordingly, Latané and Darley “systematically asked the subjects in each of their 

experiments whether they thought they had been influenced [by] the presence of other 

 people. ‘We asked this question every way we knew how: subtly, directly, tactfully,

 bluntly. Always we got the same answer. Subjects persistently claimed that their behavior 

was not influenced by the other people present. This denial occurred in the face of results

showing that the presence of others did inhibit helping’” (Latané & Darley, 1977, p. 124).

In addition to reviewing these and other studies by various researchers, Nisbett and

Wilson (1977) conducted their own experiments and concluded that any introspective

access subjects may have about higher order mental processes “is not sufficient to produce

accurate reports about the role of critical stimuli in response to questions asked a few

minutes or seconds after the stimuli have been processed and a response produced” (1977,

 p. 246). For instance, in one study they showed that by manipulating the warmness or 

coldness of a person’s personality, in this case someone who was portraying a college

instructor on videotape, they could influence subjects’ ratings of that person’s

attractiveness, speech and mannerisms, even though subjects concluded the opposite: that

it was their feelings about the individual’s appearance, speech, and mannerisms that had

influenced whether they liked him, not the warmness or coldness of the person’s behavior.

Half the subjects saw a videotape in which the instructor answered questions in a pleasant

and enthusiastic way, the other half saw the same instructor, with the same appearance,

speech and mannerisms, behaving in an intolerant and distrustful way in response to the

same questions. Both groups of subjects were then asked to rate the instructor’s overall

likeability and three specific attributes: appearance, speech and mannerisms. Those who

saw the warm condition liked the instructor much better and a majority rated his attributes

attractive. Those who saw the cold condition disliked the instructor and a majority rated

his attributes irritating. However, when subjects were questioned about whether their 

overall liking or disliking of the instructor had influenced their ratings on the three

attributes, they denied any such relationship. Instead, they suggested that their ratings on

the three attributes influenced their overall liking or disliking, even though the instructor’s

three attributes were the same in both experimental conditions. This is only one of several

examples the authors offered to support their conclusion that people typically are not

consciously aware of the reasons for their evaluations and decisions, a finding which has

enormous importance for the present study.

While Nisbett and Wilson (1977) found abundant evidence of subjects’ lack of 

introspective awareness, they also noted the fact, “obvious to anyone who has ever 

questioned a subject about the reasons for his behavior or evaluations, that people readily

answer such questions. Thus while people usually appear stumped when asked about

 perceptual or memorial processes, they are quite fluent when asked why they behaved as

they did in some social situation or why they like or dislike an object or another person”

(Nisbett & Wilson, 1977, p. 232). Or to put it differently, “If we’ve got questions, then

they’ve got answers” (Fischhoff, 1991, p. 621). To explain this apparent inconsistency,

 Nisbett and Wilson

 propose that when people are asked to report how a particular stimulus influenced

a particular response, they do so not by consulting a memory of the mediating

 process, but by applying or generating causal theories about the effects of that type

of stimulus on that type of response. They simply make judgments . . . about how

 plausible it is that the stimulus would have influenced the response. These

 plausibility judgments exist prior to, or at least independently of, any actual 

contact with the particular stimulus embedded in a particular complex stimulus

configuration. (1977, p. 248)

In other words, when we are asked to explain why we decided or behaved as we did, we

do not actually search our memories for the actual reason for the decision or action in this

specific instance. Furthermore, it is not clear that the actual reason, which may be

 preconscious, is even available for recall (Damasio, 1994; Epstein & Pacini, 1999).

Instead, when asked to explain a decision we refer to existing or generate new causal

theories about what the reasons for our decision or action could plausibly be given our 

experience and understanding of causal relations, or what the reasons should be given the

standards or expectations of the particular subculture or culture of the person asking the

questions.

 Nisbett and Wilson found that people have little or no conscious access to the true

reasons for their evaluations, decisions, or actions, but that they could easily provide

reasons to explain or support their evaluations, decisions, and actions. The reasons

 provided were rarely the actual ones underlying their choices or behavior, however.

Instead, articulated reasons were those that were plausible explanations of their choices or 

 behavior, based on the person’s causal theories about what reasons plausibly explained

certain choices or actions. While the research Nisbett and Wilson cite and conduct does

not directly address adult decision making about complex questions, its implications for 

decision making and reasoning are significant for the present study because their work 

 provides a broad range of evidence that preconscious mental processes have a role in what

we do (and therefore, presumably, in what we decide to do), and that we may not be aware

of or able to report their operation when asked to explain our reasons for doing something.

Affect Independence and Affect Primacy

Zajonc (1980) examined how affect and feelings influence preferences. For Zajonc

(1980, p. 152), preferences include responses to the following questions: “Do you like this

 person?” “How do you feel about capital punishment?” “Which do you prefer, Brie or 

Camembert?” He concluded “that the form of experience that we came to call feeling

accompanies all cognitions, that it arises early in the process of registration and retrieval,

albeit weakly and vaguely, and that it derives from a parallel, separate, and partly

independent system in the organism” (1980, p. 154). This idea of separate systems for 

affective and conscious information processing was also developed by Epstein (1990), as

discussed in a subsequent section.

Zajonc (1980) cited evidence that affect is primary and that conscious thought

comes later. For instance, he cited several studies of the “exposure effect,” wherein

subjects demonstrate an increasing preference for objects (Turkish-like words or Japanese ideographs, for example) merely as a result of repeated exposure to them.

Zajonc (1980) also cited evidence that affect and thought are products of two

separate information processing systems. This is very important in investigating the first

research question as it introduces the idea that preconscious processes operate separately

from, and therefore can exert independent influence upon, conscious processes. In one

study, by Hyde and Jenkins (1969), subjects recalled words they were asked to rate for 

 pleasantness better than words for which they were asked to count the number of letters or 

report the presence of the letter “E.” Rogers, Kuiper, and Kirker (1977) found that

adjectives subjects examined for self-relevance were recalled with greater accuracy than adjectives subjects examined for structural, phonemic, and semantic qualities. Bower and

Karlin (1974) had subjects rate photographs of faces for gender, honesty, or likeability,

with subjects showing better recognition memory in the latter two conditions. In these and

similar studies, Zajonc found evidence of a separation between affect and cognition so

that an overall affective impression or attitude might exist separately in the brain from the

cognitive components that contributed to the overall impression.

In summary, according to Zajonc (1980), our affective reaction to a stimulus object precedes and is independent of our conscious deliberation about that object.

“Neuroscientists have confirmed and provided additional detail to Zajonc’s argument that

emotional systems evaluate sensory information before and without the involvement of 

conscious awareness. Indeed, these systems perform this task before conscious awareness

gets a crack at even a reduced portion of that same information” (Marcus, Neuman, &

Mackuen, 2000, p. 38). Zajonc recognized, but did not examine, the possibility that his findings of affect primacy suggest that decision making is also an automatic or preconscious

 process.

Decisions are another area where thought and affect stand in tension to each other.

It is generally believed that all decisions require some conscious or unconscious

 processing of pros and cons. Somehow we have come to believe, tautologically, to

 be sure, that if a decision has been made, then a cognitive process must have

 preceded it. Yet there is no evidence that this is indeed so. In fact, for most

decisions, it is extremely difficult to demonstrate that there has been any prior 

cognitive process whatsoever. . . . We sometimes delude ourselves that we proceed

in a rational manner and weigh all the pros and cons of the various alternatives.

But this is probably seldom the case. (Zajonc, 1980, p. 155)

Although Zajonc (1980) articulated the implications of his findings for the study of 

reasoning and decision making over 20 years ago, there appears to be no research about

 political decision making, or any other decision making about complex issues, that

extended his findings to test the hypothesis that such decisions are the product of 

automatic processes. This may be due in part to the controversy concerning Zajonc’s

surmise that affect precedes cognition. As Carlston and Smith (1996, p. 187) observe,

“Criticisms of Zajonc’s views have also accumulated in the years since his affective

 primacy hypothesis was first published . . . [D]espite controversy over more provocative

aspects of the affective primacy hypothesis, there is considerable evidence that some kinds

of affective responses can occur with the rapidity and automaticity that Zajonc suggested.”

The most important issue for the present study is not whether affect or cognition

come first (Clore, 1994; Lazarus, 1994). What is most important is Zajonc’s suggestion

that affective processes might operate independently of conscious processes, and that

affect might influence conscious reasoning. Zajonc’s work is important to the present

study because his work provides evidence that preconscious processes become active

quickly in response to attitude objects and may operate separately from conscious

 processes, two properties of preconscious processes that are necessary to maintain the

central hypothesis.

Also, in noting the primacy and independence of affect, Zajonc (1980) articulated

the following principles about affect: affect is basic, affective reactions are inescapable,

affective judgments tend to be irrevocable, affective judgments implicate the self,

affective reactions are difficult to verbalize, and affective reactions may become separated

from content. For purposes of this study and the first research question, the last three of these

 principles are the most relevant. The point that affective judgments implicate the self is of 

enormous significance, and is likely the reason that one of the most common findings in

studies of reasoning and decision making is that people protect their theories and beliefs,

and are likely to dismiss evidence that challenges them (Klaczynski, 1997; Klaczynski &

Gordon, 1996; Kuhn, 1991). That affective reactions are difficult to verbalize overlaps

with Nisbett and Wilson’s (1977) findings. Put simply, the separation of affect and content

(consciously-available information) means that we can often remember how we feel about

something without being able to recall the reasons for the feeling. This suggests that

decision making is not necessarily a memory-based examination of consciously-available

information. How this separation relates to political decision making will be discussed

further in connection with online models of political decision making.

Automatic Evaluation Effect

Building on a line of research initiated by Fazio, Sanbonmatsu, Powell, and Kardes

(1986), Bargh, Chaiken, Raymond, and Hymes (1996) investigated the generality of the

“automatic evaluation effect,” which refers to a process in which the mere presence of a

stimulus object causes people to have an automatic affective evaluation response to the

object, without any mediating conscious process. Following the work of Fazio et al.

(1986) and of Bargh, Chaiken, Govender and Pratto (1992), Bargh et al. (1996, p. 120)

conducted three experiments and concluded that “all attitude object stimuli studied were

shown to trigger an immediate, reflexive, and uncontrollable good or bad response.” This

is an important extension of the research by Fazio et al., and what Bargh et al. (1996) find

is consistent with Zajonc’s (1980) affect primacy and independence hypotheses (for an

expanded discussion on automatic evaluations see Tesser & Martin, 1996). Research on

the automatic evaluation effect is relevant here because this effect may refer to a

 preconscious process that influences decisions about complex policy questions.

Since the research by Bargh et al. (1996) is based on earlier work by Fazio et al.

(1986), it is appropriate to first describe the automatic evaluation research paradigm

developed by Fazio et al. First, subjects spent several minutes indicating their attitude

(“good” or “bad”) towards 92 attitude objects; “the latency and valence of these

evaluations were used to select the 16 attitude objects for each subject,” so that for each

subject there were equal numbers of strong and weak, as well as good and bad, attitudes

for the priming task phase of the study (Bargh et al., 1992, p. 894). Then, in the priming

task phase, subjects were briefly presented (for 200 ms) with an attitude object word (e.g., “landlords”) on a computer screen, followed by a blank screen for 100 ms. Then the

subjects were shown an adjective on the screen where the object word had been originally.

Subjects were to evaluate whether the adjective was “good” or “bad” in meaning by

 pressing the corresponding button on an input device.

The computer recorded the time that elapsed between the appearance of the

adjective and the pressing of the button. This procedure was repeated for the 16 objects

selected for each subject. Fazio et al. (1986) designed this task, which was repeated by

Bargh et al. (1992), to investigate the hypothesis that if the adjective was of the same

valence as the object word, subjects would respond more quickly than if the adjective and

object were not of the same valence (i.e. one had no valence) or were of opposite

valences. In other words, this experiment was designed to test whether the evaluation task 

was primed or facilitated by the brief presence of an object word of similar valence. The

time elapsed between presentation of the object word and the evaluation task (the “stimulus onset asynchrony” or SOA) was only 300 ms, which, according to Bargh et al.

(1996), Fazio et al. intended to be

too brief an interval to permit subjects to develop an active expectancy or response

strategy regarding the target adjective that follows; such conscious and flexible

expectancies require at least 500ms to develop and influence responses in priming

tasks. Given an SOA of 300ms, then, if presentation of an attitude object prime

influences response time to a target adjective, it can only be attributed to an

automatic, unintentional activation of the corresponding attitude. (Bargh et al.,

1992, p. 894)
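
Purely as an illustration of the trial structure just described, the timing of a single priming trial can be sketched in code. The 200 ms prime, 100 ms blank, and resulting 300 ms SOA come from the description above; the console input and output, the example target adjective, and the function and variable names are hypothetical stand-ins for the actual experimental software used by Fazio et al. and Bargh et al.

    import time

    # Illustrative sketch only: one trial of the sequential priming paradigm.
    # Durations follow the description in the text; everything else is hypothetical.
    PRIME_DURATION_S = 0.200   # attitude-object prime shown for 200 ms
    BLANK_DURATION_S = 0.100   # blank interval of 100 ms (SOA = 200 ms + 100 ms = 300 ms)

    def run_trial(prime_word: str, target_adjective: str) -> float:
        """Present one prime-target pair and return the response latency in milliseconds."""
        print(prime_word)                  # prime appears briefly
        time.sleep(PRIME_DURATION_S)
        print()                            # blank screen
        time.sleep(BLANK_DURATION_S)
        onset = time.perf_counter()        # target onset; latency is timed from here
        print(target_adjective)            # target adjective appears at the 300 ms SOA
        input("Respond g (good) or b (bad), then press Enter: ")
        return (time.perf_counter() - onset) * 1000.0

    latency_ms = run_trial("landlords", "appealing")   # "appealing" is a hypothetical target
    print(f"Response latency: {latency_ms:.0f} ms")

In the actual studies, latencies on trials where prime and target shared a valence were compared with latencies on mismatched trials; faster responses on matched trials were taken as evidence of automatic attitude activation.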

Fazio et al. found a significant automatic activation effect for those object words for 

which subjects presumably (based on the latency of their evaluations for each object) had

strong, accessible attitudes, but this effect was not strong for objects for which subjects

 presumably did not possess strong attitudes.

With this as their starting point, Bargh et al. (1996) decided to investigate the

generality of the automatic evaluation effect, to determine whether the effect depended

upon the presence of a conscious evaluation task like pressing a button for a good or bad

meaning or on attitude strength towards the adjectives. Their first experiment was

identical to the Fazio et al. (1986) experiment, with the exception that Bargh et al.

removed the adjective evaluation task “so that the hypothesis of automatic attitude

activation in the absence of a strategic evaluation processing goal could be tested” (1996,

 p. 108). Instead of evaluating the adjectives (that followed presentation of object words)

as good or bad, they were to pronounce them as quickly as they could. The results of this

first experiment indicated that “the automatic attitude evaluation effect is not conditional

on the subject having an explicit, conscious evaluative goal. Removing the evaluative goal

from the paradigm did not eliminate the automaticity effect” (Bargh et al., 1996, p. 112).

Also, removing this feature of the Fazio et al. paradigm resulted in an automatic

evaluation effect that “was equally probable for the subjects’ idiosyncratically strong and

weak attitudes” (Bargh et al., 1996, p. 112).

In their second experiment, Bargh et al. (1996) eliminated another component of 

the original Fazio et al. (1986) paradigm by removing the prior attitude assessment task in

which subjects spent several minutes evaluating each of 92 object words before

commencing the priming task with 16 subject-specific object words. Instead, Bargh et al.

(1996) preselected strong and weak, as well as good and bad, object word primes based on

data from Bargh et al. (1992). As with the first experiment, Bargh et al. (1996, p. 116)

found that removing another aspect of the original paradigm “that might induce an

evaluative processing strategy . . . does not remove the automatic attitude activation

effect. The case for unconditional automatic evaluation effects, in which environmental

stimuli are classified as good or as bad immediately, efficiently and uncontrollably by the

individual, is strengthened by these results.” Bargh et al. (1996) removed any remaining

evaluative aspects of the Fazio et al. paradigm in their third experiment by replacing the

strongly valenced adjectives of the first two experiments, and the earlier work by Fazio et

al. and Bargh et al. (1992) , with adjectives of “less obvious valence.” “The moderate

quality of the target’s evaluations would make it very unlikely that they would induce an

evaluative processing goal” (Bargh et al., 1996, p. 117). The results of this third

experiment confirmed that the automaticity effect continues even after all evaluative

aspects of the Fazio et al. design are removed.

In light of the results of these three experiments and prior work on the automatic

evaluation effect, Bargh et al. (1996) concluded that people have an automatic and

uncontrollable affective evaluation (i.e. good or bad) in response to all the attitude object

stimuli presented. The first research question in the present study tests whether 

 participants have this sort of affective or other preconscious evaluation in response to

complex policy questions. The hypothesis in this study is that the automatic evaluation

effect extends beyond the simple words or objects examined by Fazio et al. (1986) and

Bargh et al. (1996), to the educational policy questions presented to participants in this

study. There is evidence that we may “automatically evaluate all stimuli [we] come in

contact with, no matter how mundane,” before conscious reasoning is activated because

there may be “some adaptive purpose served by screening all objects, people and events in

terms of their valence” (Bargh et al., 1996, p. 123).

Bargh et al. (1996, p. 123) note that Lazarus (1991) and LeDoux (1989) “have

concluded that stimuli are automatically and preconsciously evaluated in terms of their 

implications for the self.” This would certainly be adaptive if the stimuli were harmful or 

threatening. And there is no reason to believe that when responding to political questions

we are able somehow to bypass the basic automatic and adaptive tendencies of our 

evolved cognitive system, in which affect and automatic processes might dominate,

operate independently of, and become active ahead of conscious processes. In other 

words, it is possible that a preconscious and automatic evaluation process influences the

decision-making process about complex questions.

The Social Intuition Model of Moral Judgment and Moral Reasoning

Haidt’s (2001) work on moral judgment and moral reasoning is closely related to

the present study. The central hypothesis of the present study is similar to Haidt’s

hypothesis that moral judgment (or decisions) may not follow or be caused by moral

reasoning. He offers four reasons for doubting the proposition that moral reasoning causes

moral judgment:

(a) There are two cognitive processes at work–reasoning and intuition–and the

reasoning process has been overemphasized; (b) reasoning is often motivated [by

goals other than accuracy]; (c) the reasoning process constructs post hoc

 justifications, yet we experience the illusion of objective reasoning; and (d) moral

action covaries with moral emotion more than with moral reasoning. (Haidt, 2001,

 p. 815)

Accordingly, Haidt proposes a social intuition model of moral judgment, in which

the eliciting situation prompts quick moral intuitions that cause moral judgment, followed

 by slow, ex post facto moral reasoning. In the present study, the same process is

hypothesized to operate in connection with political decision making, even though

political decision making is assumed to be qualitatively different from moral judgment,

given that political issues are not limited to judging a person’s character or actions, which

is how Haidt defines moral judgments, and given that moral issues are more emotionally

significant or deeply felt than many political questions.

 Notwithstanding the fact that Haidt’s work and social intuition model are focused

on a narrower set of judgments or decisions, his use of the term “intuitive” was

adopted here to describe the initial preconscious decision in the IDMR model in Figure 2.

Although Haidt’s model is highly relevant to the present study, the central hypothesis was

developed in advance of exposure to Haidt’s research. This fact, in addition to Haidt’s

reference to Nisbett and Wilson, Zajonc, Fazio, and Bargh, whose work helps inform

Haidt’s, explains why Haidt appears after these other sources in this chapter.

Haidt’s (2001) social intuition approach is an interpersonal model of moral

 judgment, so the judgment and reasoning of person A can influence the intuitions of 

 person B, not only the judgment and intuitions of person A. However, “[t]he core of the

model gives moral reasoning a causal role in moral judgment but only when the reasoning

runs through other people. It is hypothesized that people rarely override their initial

intuitive judgment just by reasoning privately because reasoning is rarely used to question

one’s own attitudes and beliefs” (Haidt, 2001, p. 819). It is important to make clear that,

unlike Haidt, this study takes no position on whether post-judgment reasoning can change

 judgments and intuitions, and whether this change is rare or frequent if it is possible. This

question is one better left for studies of conceptual change or persuasion. There is no

evidence in the literature that post-judgment reasoning cannot change a judgment already

made, however, or that reasoning cannot over time change the preconscious intuitions that

led to it. Epstein (1998) theorizes that thought can change feelings, since our emotional

response to certain events is the result of how we consciously evaluate the situation. In

sum, showing that judgment precedes reasoning says nothing about the likelihood of 

conceptual change or persuasion.

To clarify the distinctions Haidt (2001) makes between intuitions, judgment, and

reasoning, it is appropriate to include his definitions for these terms. Moral judgments are

defined as “evaluations (good vs. bad) of the actions or character of a person that are made

with respect to a set of virtues held to be obligatory by a culture or subculture” (Haidt,

2001, p. 817). In view of this definition, political judgment encompasses moral judgment

as the latter relates to candidate evaluations or evaluations of policies espoused by specific

 people. However, political judgment also encompasses more than moral judgment;

 political decision making also involves issues, policies, and other abstract ideas or goals

that do not concern the actions or character of a person but rather the actions of a group or 

an institution, and the consequences of group and institutional action over time. Moral

reasoning is “conscious mental activity that consists of transforming given information

about people in order to reach a moral judgment” (Haidt, 2001, p. 818). Finally, moral

intuition can be defined as “the sudden appearance in consciousness of a moral judgment,

including an affective valence (good-bad, like-dislike), without any conscious awareness

of having gone through steps of searching, weighing evidence, or inferring a conclusion”


(Haidt, 2001, p. 818). To use Haidt’s terminology, the present study investigated whether 

intuition plays a role in political decision making.

In support of his social intuition model, Haidt (2001) presents evidence from a

number of disciplines. He cites Zajonc (1980), Bargh et al. (1996), and Fazio et al. (1986)

to show that affective evaluations occur automatically, which supports Haidt’s hypothesis

that moral judgments are automatic. Haidt cites dual-process theories from social

 psychology (Chaiken & Trope, 1999) as evidence that moral judgments, like certain other 

 judgments, can be the result of intuitive processes. The literature Haidt considers most

relevant to his model, however, is on attitude formation. Haidt finds guidance in evidence

that indicates that “attitude formation is better described as a set of automatic processes

than as a process of deliberation and reflection about the traits of a person,” or that

“[p]eople form first impressions at first sight, and the impressions that they form from

observing a ‘thin slice’ of behavior (as little as 5 [seconds]) are almost identical to the

impressions they form from much longer and more leisurely observation and deliberation”

(Haidt, 2001, p. 820). Haidt also finds support in the work of Bargh and Chartrand (1999),

Damasio (1994) and Nisbett and Wilson (1977).

Like the present study, Haidt’s (2001, p. 819) hypotheses about moral judgment

“involve more complex social stimuli than the simple words and visual objects used in the

automatic evaluation studies” referred to in the previous section on the automatic

evaluation effect. Notwithstanding this, like Haidt, this study also relied on the evidence

from automatic evaluation studies and studies of attitude formation to support the

investigation of whether even complex decisions about political issues are subject to

 preconscious influence. Although unsupported intuitions of the sort Cosmides and Tooby


(1994) warn against must be avoided, there is no reason to believe that political decision

making is any more sophisticated in its initial stages than attitude formation, impressions

about people, or evaluations of stimulus objects in the environment. It can be argued that

the judgment-first, reasoning-second model Haidt advances for moral reasoning also

describes political reasoning, even though, unlike the moral questions Haidt investigated,

 political questions are not always emotionally salient.

Information Processing May Not Be Motivated by a Search for Accuracy

In light of the foregoing research on introspective awareness and a priori causal

theories, affect primacy, the automatic evaluation effect, and moral judgment, there is

reason to believe that “information processing is not accuracy motivated” (Klaczynski &

 Narasimhan, 1998, p. 176), and that “the desire to preserve existing belief systems and

ego investments is stronger than the desire for consistent, objective reasoning”

(Klaczynski & Narasimhan, 1998, p. 185). A hypothesis of this study is that even in

matters of public policy, where it has been assumed that decisions are the product of 

deliberation, we may begin the process of making a decision with an intuitive decision

that is available and influential before conscious reasoning has begun to weigh evidence

and evaluate possible decision alternatives.

After reviewing a diversity of findings on automatic mental processes, Bargh and

Chartrand (1999, p. 475) wrote: “[s]o it may be, especially for evaluations and judgments

of novel people and objects, that what we think we are doing while consciously

deliberating in actuality has no effect on the outcome of the judgment, as it has already

 been made through relatively immediate, automatic means.” While it remains to be seen

whether the data collected in this study support Bargh and Chartrand’s conjecture, the


literature reviewed strongly suggests that many basic assumptions in the decision

literature on the nature and quality of the decision-making process may not be sound.

Theories about the Interaction Between Emotion and Reason

“Neuroscience now implicates emotion not only and obviously in what we are

feeling, but also in how and about what we think, and what we do” (Marcus et al., 2000, p.

38). This section addresses two theories that contemplate the interaction between

 preconscious processes (e.g., emotion, feelings or affect) and reason, and some of the

evidence that human reasoning is intimately connected with, and directed by,

 preconscious processes. The first is Epstein’s (1990) cognitive-experiential self-theory

(CEST), and the second is Damasio’s (1994) somatic marker hypothesis. This study was

initiated because of Epstein’s theory and its implications for decision making, so CEST is

the theoretical basis for the present study. Similarly, although the other research sources

cited in this chapter influenced the research questions and the design of the study,

Damasio’s work may be the single most important source of empirical support for the

 proposition that preconscious processes might influence decision making about complex

questions. We turn now to a discussion of each theory and how each relates to the research

questions.

There are a number of dual-process theories of human information processing,

 particularly in the social psychology literature (see Chaiken & Trope, 1999, for a

collection of these theories), but also in the literature on rational thought and decision

making (Evans & Over, 1996; Marcus et al., 2000; Sloman, 1996; Stanovich & West,

2000), affect primacy (Zajonc, 1980) and moral judgment (Haidt, 2001). Based on a

review of all of these theories, it was concluded that Epstein’s (1990) cognitive-


experiential self-theory was the most comprehensive, complete and useful theory, and it is

the theory that provided the overall framework for the present study. It is CEST that led to

the hypotheses that preconscious processes influence complex decisions, which is the first

research question, and that what we know about an issue may determine whether we rely

more heavily on preconscious or conscious processes in making a decision, which is the

second research question. The third research question is closely related to the second, so it

too is a product of Epstein’s theory.

The central premise of Epstein’s CEST is that humans adapt to their environment

 by means of two information processing systems: a preconscious experiential system and

a primarily conscious rational system (Morling & Epstein, 1997). While the two systems

operate differently, they also operate interactively and in parallel. The experiential system

is the one responsible for responding quickly and efficiently to life events on the basis of 

heuristic principles and schemata that are most often inductively derived from emotionally

significant past experiences (Kirkpatrick & Epstein, 1992). This system has a long

evolutionary history, is present in some form in non-human animals, and is intimately

associated with affect. In contrast, the rational system is deliberative, slower, requires

more effort, and is not associated with affect. It operates through an individual’s

“understanding of logical rules of inference” (Epstein & Pacini, 1999, p. 462), instead of 

 preconscious schemata, heuristics, and generalizations that are drawn from emotionally

significant experience.

According to Epstein, we all have preconscious constructs that our brains have

generated and refined from the beginning of our lives to make sense of experience,

without our conscious direction, even before we have the cognitive capacity for rational


direction of action. Because the preconscious system (a) has evolved for outcome- and

action-oriented quick interpretation of and response to experience, (b) operates beneath

and prior to conscious thought with little or no conscious effort, and (c) is associated with

affect and is, therefore, inherently compelling, Epstein contends that most everyday behavior

is governed by the preconscious. Through quick interpretations and the associated affect,

our preconscious usually determines our course before the conscious system is activated.

And, even if the conscious system does appear to be involved in decision making, it is

often only to justify or “rubber stamp” the action that feels intuitively most right. The

central hypothesis of this study, about the influence of preconscious processes on decision

making about policy questions, and the first research question are based on these

 predictions.

Epstein (1990, pp. 167-68) explains the operation of the experiential system, in

contrast to the rational system, as follows:

Unlike the rational system, which guides behavior by direct assessment of 

stimuli, the direction of behavior by the experiential system is mediated by

feelings, or “vibes”; these include vague feelings of which individuals are

normally unaware, as well as full-blown emotions of which they are

usually aware. The experiential system is assumed to operate in the

following manner. When an individual is confronted with a situation that

requires some kind of response, depending on past emotionally similar 

experiences, the person experiences certain feelings. The feelings, or vibes,

which can be very subtle, motivate action tendencies to seek to further the

state if the vibes are pleasant and to reduce the state if they are unpleasant.


The whole process occurs extremely rapidly, so that to all appearances the

 behavior is an immediate reaction to the eliciting stimulus. In humans, the

vibes produce not only tendencies to act in certain ways, but also

tendencies to think in certain ways.

The affective operation of the experiential system is inherently compelling, more so than

the reasoning of the rational system. The affective component is the means by which

 preconscious processes can bias conscious processes like reasoning. However, the

experiential system has its greatest influence on behavior when the individual is not aware

of its operation, so that rational control cannot be exerted. This is why it is so important to

study decision making and gain a better understanding of how and why we think, decide

and act as we do, so that rational control can be exercised over our decision making and

reasoning when appropriate.

In everyday life, the differences between the two systems manifest themselves as

the perceived struggle between heart and mind. We sometimes feel strongly about one

course of action, but know we ought to take another course. Our instincts, feelings, and

“gut” reactions are often at odds with what we consider the rational and prudent path. The

 preconscious processes Epstein (1990) described are depicted in the IDMR model in

Figure 2 as influences on the “intuitive” decision, which is in turn a product of these

 processes. The intuitive decision can also be described as an instinct, a gut feeling, or a

snap judgment, for example. In the IDMR model, the intuitive decision then either 

influences the conscious reasoning process or bypasses it completely, depending on how

strong one’s preconscious signals or feelings are on a topic, resulting in the reported or 

“reasoned” decision.


Research on CEST has shown that the experiential system can override the rational

system, even when individuals are aware that they are making a decision irrationally on

the basis of what feels like the better choice (Denes-Raj & Epstein, 1994; Epstein, Pacini,

Denes-Raj, & Heier, 1996). This finding has enormous significance for the study of 

decision making since subjects chose the decision that felt better, even though rational

thought suggested a different course. Most people in the studies conducted by Epstein and

his collaborators “are aware of two modes of reasoning that correspond to the rational and

experiential systems of CEST, and . . . although people ‘know better’ (from a logical

 perspective), they report that they, like others, would behave in everyday life according to

the principles of the experiential system” (Epstein & Pacini, 1999, p. 466).

If CEST correctly describes the operation of an experiential system, and its

interaction with a rational system, it is very useful in explaining why we make decisions

and behave as we do, and fits well with the emotional-rational struggle we seem to face in

making decisions on complex questions. According to Epstein, the preconscious system

can operate invisibly, which is when it has its greatest influence on behavior. Even when

we are aware of affective influences on our conscious behavior, we have a tendency to

 justify the behavior without recognizing that what we are feeling, and how that feeling

determines how we act, may not comport with how we would decide to act if we

considered our decision more carefully. In everyday interaction, where efficiency and

effortlessness are valued most, the experiential system is adequate. However, in making

decisions where information and reasoning matter we may be misled, in a manner of 

speaking, by the schemata and heuristics that are the data components of the preconscious

system. Information and reasoning matter when we make policy decisions, so it is


important to learn whether preconscious processes influence these decisions.

Understanding how we make decisions is the first step to improving our decisions.

Viewing the influence of preconscious, affective processes on human decision

making from different perspectives, Epstein (1990) and Damasio (1994) arrive at closely related

conclusions. The similarities between CEST and Damasio’s somatic marker hypothesis

lend additional empirical support to Epstein’s theory and the idea that decision making

about complex questions must be influenced by preconscious processes. Damasio (1994)

observes that reasoning and decision making with respect to personal and social matters

may not be possible, or would be severely compromised, without the benefit of emotional

signals, or somatic markers, that narrow the range of possible response alternatives. In

other words, according to Damasio (1994) the brain preconsciously narrows the range of 

choices available to conscious reasoning, which makes it possible to make decisions and

to make them quickly.

This conclusion is based on the study of at least twelve patients with damage to

their prefrontal cortices who suffered from decision making defects but no other obvious

mental impairments. Such patients are rare because while they suffered extensive brain

damage, the damage had only a limited impact on cognitive functioning. All of these

 patients show a combination of defects in decision making and “flat emotions and

feelings. The powers of reason and the experience of emotion decline together, and their 

impairment stands out in a neuropsychological profile within which basic attention,

memory, intelligence, and language appear so intact that they could never be invoked to

explain the patients’ failures in judgment” (Damasio, 1994, p. 54).


In other words, Damasio found that emotion and decision making were connected

in the “biological machinery of reason” (1994, p. 53), and that the absence of emotion

could lead to very serious difficulties in everyday functioning, even when attention,

working memory, “perceptual ability, past memory, short-term memory, new learning,

language and the ability to do arithmetic were intact” (1994, p. 41).

I see feelings as having a truly privileged status. They are represented at many

neural levels . . . But because of their inextricable ties to the body, they come first

in development and retain a primacy that subtly pervades our mental life. Because

the brain is the body’s captive audience, feelings are winners among equals. And

since what comes first constitutes a frame of reference for what comes after,

feelings have a say on how the rest of the brain and cognition go about their 

 business. Their influence is immense. (Damasio, 1994, pp. 159-60)

While common wisdom and Western philosophy suggest that emotion interferes with

rational decision making, and that pure reason is the ideal, Damasio’s findings lead to the

conclusion that “[r]eduction in emotion may constitute an equally important source of 

irrational behavior” (1994, p. 53).

To Damasio, reasoning and deciding are intertwined, if not coequal. He observes

that when you are faced with any situation that involves a choice, your brain creates many

scenarios of possible response options and related outcomes, so “the mind is not a blank at

the start of the reasoning process” (1994, p. 170). This observation is very suggestive for 

the present study, but Damasio spends no more time on it.

Given the diversity of response options and possible outcomes in any situation in

the personal or social sphere, the question for Damasio is how one actually makes a


decision in a timely fashion. Normative rationality, which involves consideration of all

 possible response options and all possible related outcomes, is impossible in all but the

most simple and uninteresting decision-making situations. Recognizing the limitations of 

attention and working memory, and the infeasibility of ideal rationality or pure reason,

Damasio postulates that emotional signals, or somatic markers, automatically narrow the

range of possible response options based upon one’s feelings, vibes, or affective responses

to the outcomes related to the various response options. “You do not have to apply

reasoning to the entire field of possible options. A preselection is carried out for you,

sometimes covertly, sometimes not” (Damasio, 1994, p. 189). Those outcomes that

 provoke an unpleasant feeling, however fleeting and subtle, are quickly dispatched along

with their associated response options, so that we may choose from the reduced number of 

response options or choices that remain. This process happens automatically and almost

imperceptibly, much like the workings of Epstein’s preconscious system. In other words,

somatic markers are a special instance of feelings generated from secondary

emotions. Those emotions have been connected, by learning, to predicted future

outcomes of certain scenarios. When a negative somatic marker is juxtaposed to a

 particular future outcome the combination functions as an alarm bell. When a

 positive somatic marker is juxtaposed instead, it becomes a beacon of incentive.

(Damasio, 1994, p. 174)

Thus, somatic markers (i.e., emotional signals or weights) function as a “biasing device”

that makes reasoning in the personal and social realm possible. Without this device,

intellectual paralysis would result whenever one was faced with a decision on personal or 

social matters. Damasio (1999, p. 42) makes clear, however, that his somatic marker


hypothesis does not suggest “that emotions are a substitute for reason or that emotions

decide for us.” This is why Epstein’s dual-process theory is so important; it offers a way

to represent the interactions between automatic and conscious cognitive processes,

without dictating that we are limited to either a normative pure reason model or an

automatic affective model of decision making.

The similarities between CEST and the somatic marker hypothesis are striking.

Both detail the operation of a preconscious process that influences reasoning and decision

making. Both Epstein (1990) and Damasio (1994) hypothesize that the values of the

 preconscious system, whether referred to as vibes, instincts, or somatic markers, are

acquired through an individual’s experiences and operate, for the most part, beneath

conscious awareness. As a result, as Nisbett and Wilson (1977) found, we have little or no

introspective access to why we make the decisions we do, and much of the decision-

making process is never consciously available for scrutiny. Nevertheless, as CEST

 provides, the rational system can intervene to alter a decision already made.

Affect as a Substitute for Conscious Reasoning in Risk Analysis

Recent research on risk analysis has investigated how affect might bear upon

reasoning about risk. These studies suggest that affect may substitute for conscious

reasoning when the decision maker does not have enough consciously-available

information to make the decision without some heuristic or when the subject matter 

provokes an emotional response (Finucane, Alhakami, Slovic, & Johnson, 2000 [college students rated risk or benefit of various technologies on a 7- or 10-point scale, under time pressure or after reading 3 short vignettes]; Ganzach, 2000 [business school students rated risk or return of unfamiliar and familiar stock markets on a 9-point scale]; Peters & Slovic, 2000 [in a study of individual differences in affective information processing, college students completed affective and analytical information processing measures during an initial session, and in a later session played a card game designed to elicit subjects’ affective responses to gains and losses, after which they rated the various decks of cards on a 5-point scale]; Peters & Slovic, 1996 [as part of a larger national telephone survey consisting of 155 questions, subjects answered 16 questions designed to elicit (a) images about nuclear power using word associations rated on a 5-point scale, (b) worldviews, and (c) an index of nuclear support]; Pohl & Hell, 1996).

For instance, Finucane et al. (2000) found that when asked about nuclear power,

subjects’ decisions followed from their affective response to it rather than from a rational analysis of its risks and benefits. Also, Ganzach (2000)

found that when subjects were asked to estimate the risks and returns of investments in

unfamiliar stock exchanges, their estimates of both risk and return originated from a

global evaluation of the stock exchange rather than specific information about the risks

and returns of investments listed on the stock exchange. By contrast, for familiar 

investments the analysis of risk was independent of the analysis of returns, and each

 proceeded from available information rather than a global preference.

These studies on the “affect heuristic” challenge traditional decision models by

introducing the affect heuristic as a preconscious process that operates in place of conscious reasoning about decision-specific information.


 Non-Consequential Decision Making

Like Haidt’s work, Evans’s (1996) study of the Wason card-selection task 

 provides direct support, and has important implications, for the central hypothesis of the

present study of political decision making. Evans’s study is discussed towards the end of this section, however, because it involves the selection of cards in a commonly used “game” to test reasoning skills. The decision task in Evans’s study limits its usefulness in

understanding complex real-world questions. Nevertheless, Evans found that we may

make decisions without thinking about their consequences, and this is almost the same as

saying that preconscious processes influence decision making.

Evans (1996) found evidence of non-consequential decision making (decision

making without reasoning about the consequences of each decision alternative as

 predicted by traditional models) in a study of subjects asked to solve several versions of 

the Wason card-selection task. In the Wason task, subjects are asked to decide which of 

four cards they would have to turn over to test the truth of a conditional statement. Evans

concluded that card selection was determined by preconscious cues of relevance, so that

subjects did not look at or think about every card. Instead, they quickly focused on certain

cards and chose from the cards they focused on. Evans’s conclusions about preconscious

cues of relevance are represented in the IDMR model, and even the title of his article

“Deciding before you think” suggests that its implications are identical to several of the

hypotheses of this study. However, as with almost every other study of decision making,

the task is artificial in that it is very difficult to generalize from behavior in response to a

card selection task to decision making about complex political issues, for example.


Reason-based Analyses of Choice and Why Reasons Are So Important

In those cases where a decision is not the product of conscious reasoning alone,

what purpose do reasons serve? If it is true that reasoning follows intuitions, judgments,

and decisions, then for most tasks or decisions “reasoning” may be a literal description of

the role of conscious thought: to generate reasons that support, justify, or make sense of 

the already-settled judgment or decision, so that thinking can stop and the mind can move

on to other things. Findings from research on informal reasoning (Means & Voss, 1996;

Voss, Perkins, & Segal, 1991) and argument skills (Kuhn, 1991) suggest that in most

cases, most people reason no more than is required to find a plausible reason to stop

thinking and to make a decision.

This section reviews Kuhn’s (1991) reason-based analysis of reasoning about

social issues. Kuhn’s study is important for two reasons here. First, it was the only study

found involving interviews with adults, some of whom might qualify as experts, about

complex social questions, including criminal recidivism and failure in school. Second, the

interview protocol, coding of data, and variables to be measured in the present study,

discussed more fully in Chapter III, are based on Kuhn’s work. As with the preceding

subsections, this discussion of Kuhn’s study also pertains to the second and third research

questions.

Kuhn’s work, and other reason-based analyses of thinking, emphasize the

importance of reasons in understanding our cognitive system. According to Shafir,

Simonson, and Tversky (1993, pp. 617-618) these analyses reveal that “[w]e often search

for a convincing rationale for the decisions we make, whether for inter-personal purposes,

so that we can explain to others the reasons for our decision, or for intra-personal motives,


so that we may feel confident of having made the ‘right’ choice.” Nisbett and Wilson

(1977, p. 233) make a similar observation: “the central idea of attribution theory is that

 people strive to discover the causes of attitudinal, emotional, and behavioral responses

(their own and others), and that the resulting causal attributions are a chief determinant of 

a host of additional attitudinal and behavioral effects.” Reasons certainly have a central

role in decision making. The question in the present study is whether there are

circumstances in which reasons alone do not explain how policy decisions are made.

For purposes of this review, the most persistent and relevant findings of reason-

 based analyses of conscious thought are as follows: (a) the premature closure of 

reasoning, possibly because people’s epistemological beliefs and the low value they place

on justified true beliefs do not motivate them to suspend judgment in circumstances where

sustained inquiry is appropriate, and (b) the protection of the self, including existing

attachments, beliefs and theories (Alford, 2002; Granberg, 1993; Haidt, 2001; Hofer &

Pintrich, 1997; Kahneman & Lovallo, 1993; Klaczynski & Gordon, 1996; Kuhn, 2001;

Kuhn, Weinstock, & Flaton, 1994). With this in mind, we now turn to Kuhn’s study of 

argument skills and “informal” reasoning. The term informal when used in connection

with reasoning refers to reasoning that does not accord with formal models.

Kuhn’s Study of Argument Skills

In terms of design, Kuhn (1991) recommends that in studying thinking we

conceive of thinking as argument instead of as problem solving, look at the sorts of 

thinking people do in their everyday lives, avoid artificial content and instead use real,

meaningful questions, and consider reasoning about ill-structured problems. These were

all goals of the present study.


Kuhn elicited subjects’ causal theories about three social phenomena using the

following questions: What causes prisoners to return to crime after they’re released? What

causes children to fail in school? What causes unemployment? In the two interviews that

were conducted with all of the subjects in her study, Kuhn asked the following types of 

questions for each of the three topics: six questions concerning justification of the causal

theory, eight questions about contradictory positions, two questions on instrumental

reasoning, and nine questions on epistemological reasoning. Kuhn investigated subjects’

 justification of their causal theories, their ability to generate alternative theories,

counterarguments, and rebuttals to counterarguments, and their evaluation and use of 

evidence.

Although the present study elicited participants’ decisions, rather than their causal

theories as Kuhn (1991) did, the present study was designed based on Kuhn’s work. As

described in greater detail in Chapter III, the post-decision interview in Appendix A that

was used to interview legislators and doctoral students about the content and quality of 

their reasoning about educational policy decisions was an abridged version of Kuhn’s

 protocol, which is in Appendix B for purposes of comparison.

Briefly, Kuhn’s (1991) findings were as follows. A minority of subjects supported

their causal theories with genuine evidence, as opposed to pseudoevidence or 

nonevidence, and only 16 percent of subjects generated genuine evidence to support their 

causal theories on all three topics. None of the subjects claimed that they were unable to

 provide evidence to support their theories even though the “majority of people do not

appear able to make appraisals of the strength of the evidence they generate” (Kuhn, 1991,

 pp. 93-94). Also, “subjects generating nonevidence or pseudoevidence are as certain as


those generating genuine evidence” (Kuhn, 1991, p. 197). Thirty-three percent of subjects

generated alternative theories and 34 percent of subjects generated counterarguments on

all three topics. When asked to evaluate evidence subjects showed “a prevalent, and

disturbing, tendency to assimilate any new information to [their] existing theories” (Kuhn,

1991, p. 268).

If instead of being firmly differentiated from the theory, [evidence] is simply

assimilated to it, any ability to evaluate the bearing the evidence has on the theory

is lost. Not only does this imply the loss of the ability to ever encounter evidence

contradictory to one’s theories[, w]eak boundaries between theories and evidence

imply a confusion between what follows from a given piece of evidence and what

one in general believes to be true. (Kuhn, 1991, p. 268)

“The single most revealing finding in the epistemological category is the high

level of certainty participants claim to have in offering causal explanations of the

 phenomena they are asked about” (Kuhn, 1991, p. 265). “[P]eople confidently ‘know’ the

answers to [Kuhn’s] questions, but in the naive sense of never having contemplated that

the answers could be otherwise” (Kuhn, 1991, p. 265). Kuhn’s findings influenced the

 predictions of this study, for example, that the majority of participants would not provide

external evidence in support of their decisions and would have difficulty generating

arguments that undermined their decisions. Also, the content analysis Kuhn conducted is

similar to the analysis conducted on the data collected in this study.

Causal Theories and Policy Decisions

Throughout this document there are references to the influence Kuhn’s work had on

the present study. However, as explained in this subsection, this study examined


 participants’ decisions about specific educational policies instead of surveying their causal

theories about what causes children to fail in school, for instance, as Kuhn did.

Why Study Policy Decisions Instead of Causal Theories

While Kuhn asked participants to think about the causes of certain social problems

(e.g., “What causes prisoners to return to crime after they’re released”), this study asked

 participants whether they would support or oppose specific legislation to increase

academic achievement in public schools. There are three relevant differences for our 

 purposes between asking for a decision about specific legislation and asking for a theory

about why some social problem happens or asking in general terms how that problem

should be addressed. The first difference is that asking for a decision about a specific

policy question is a more demanding task: a decision maker must not only recall and report information about the decision topic and his or her understanding of the causes of the social phenomena at issue, but must also evaluate the quality and implications of that information with a view towards making the better choice from two policy alternatives. In other words, making a policy decision in response to a social

 problem should be a more difficult and time-consuming task than explaining why the

social problem happens, as required in Kuhn’s (1991) study, because making a policy

decision requires additional steps beyond explaining the causes of the problem. Therefore,

making an educational policy decision is not something participants in the present study

should have been able to do quickly and without conscious reflection on the decision

alternatives’ consequences.

Second, asking for a decision made it possible to compare the present study to the

existing decision-making literature in various disciplines and to challenge the assumptions


of the traditional model by providing evidence that it does not accurately describe how

 participants make political decisions. Finally, how people make complex decisions shapes

the world in a more direct way than how they explain social phenomena. Decisions about

specific legislation are one step closer to actions than reasoning generally about the causes

of social problems; decisions have a more direct influence on the world. They are,

therefore, more important and interesting.

Selecting Decision Questions

The two decisions participants made in the present study were selected following

 pilot testing of the following four questions about educational policy issues: (a) Would

you oppose or support legislation to enable [name of state] to provide computers for use in

religious primary or secondary schools as a means to improve academic achievement?; (b)

Would you support or oppose legislation to limit class size to 25 students in all [name of 

state] public schools as a means to improve academic achievement?; (c) Would you

support or oppose legislation to transfer management and control of public schools in your 

county or legislative district from the local school board to a private company as a means

to improve academic achievement?; (d) Would you support or oppose legislation to

change how public schools are financed in [name of state] so that the existing system, in

which local property tax assessments provide a major source of funding, would be

replaced by a statewide increase in the sales tax, as a means to reduce the disparities in

financial resources among the various counties? Questions (a) and (d) were eliminated

following pilot testing, as explained in Chapter III.

Decisions about educational policy were selected because they are complex and

important questions, they should be meaningful for legislators and doctoral students in


education, polls consistently show that education is one of the most important political

issues in the United States, and they are the author’s primary academic interest. In

 particular, these educational policy decisions are the sorts of decisions legislators make

every day during a legislative session. Though these questions are similar to actual

 political questions participants might consider or have considered, they are different from

the questions that typically appear in the decision-making literature.

Except for the articles cited in the section on the affect heuristic in this chapter,

there appeared to be no decision research that asked for a decision about a problem adult

decision makers would actually face as voters or elected officials. Instead, decision

researchers regularly use logic games, card selection tasks, or hypothetical scenarios that

are apparently dissimilar but logically identical to show how decision makers diverge

from normative theories of choice (e.g., Denes-Raj & Epstein, 1994; Evans, 1996;

Kahneman & Tversky, 2000). Even including the literature on the affect heuristic, there

was no research that investigated legislators’ decision making as part of a research study

or through interviews, although political scientists do sometimes evaluate legislators’ decisions retrospectively (e.g., Green & Shapiro, 1994; Tetlock, 1994). Accordingly, the

decision tasks in the present study were designed to be more realistic and meaningful for 

 participants than the logic games used in the literature on adult decision making.

Asking two questions, one that was drafted to be more familiar to participants and

one to be less familiar or unfamiliar, may help reveal whether preconscious processes

influence decision making about complex questions. For instance, as explained more fully

in Chapter III, if participants made a decision in response to the less familiar question

more quickly or with more certainty than they did for the more familiar question it would


suggest that their decision was not the product of conscious reasoning alone. After all, if 

they were actually reasoning through decision alternatives and they could not recall

sufficient decision-specific information, participants should not be certain about their

decision and they should not be able to make a decision quickly, if they could or should

make one at all. I hypothesized that using two decisions would produce results that

showed a difference in levels of prior knowledge for each question and for each sample

group, and differences in how participants decide and reason about the two decision

topics. Thus, using two questions and two sample groups also made it possible to

investigate intra- and inter-individual differences in decision making and reasoning about

the two decisions, including how decision making and reasoning differ between

legislators and doctoral students for each of the questions.

Political Decision Making

To this point, the theories and findings discussed in this chapter have concerned

decision making and preconscious processes generally. In this section we turn to evidence

that relates specifically to theories and research about political decision making, beginning

with the state of political knowledge generally.

Political Ignorance and the Construction of Preferences (and Decisions)

In this study participants made two decisions, and one of these decisions was

designed to be new to participants. The study was designed this way for a number of 

reasons, as evidenced by the three research questions. The first research question

examines, among other things, whether participants made decisions about complex

questions without the quantity or quality of consciously-available information

hypothesized by the traditional model of decision making. The second and third research


questions investigate how knowledge, experience, education and other characteristics of 

the decision maker bear upon their decision making about policy questions that do not

lend themselves to simple conclusions and quick decisions. Yet another reason for asking

participants to make one political decision about a topic for which they are not likely to have

considerable existing information is that most people vote and make policy decisions with

little or no policy-specific information.

“The widespread ignorance of the general public about all but the most salient

 political events and actors is one of the best documented facts in all of the social sciences”

(Lau & Redlawsk, 2001, p. 951). More significantly for this study, “[e]ven Americans

who are politically well-informed in general may well be ignorant of highly relevant

 policy specific knowledge” (Gilens, 2001, p. 380). “The political ignorance of the

American voter is one of the best-documented features of contemporary politics, but the

 political significance of this political ignorance is far from clear” (Bartels, 1996, p. 194),

and the two-question design of the present study explored how varying levels of 

knowledge affect decision making (Bartels, 1996; Gilens, 2001; Lau & Redlawsk, 2001;

Lupia, 1994).

So, although the sample consisted of people with some experience with and

interest in matters of educational policy, for the less familiar decision topic it is possible

that how they decided was similar to how novice adults make political decisions. It is

 possible that the “experts” in the present study behaved in a manner consistent with

Hogarth and Kunreuther’s (1995, p. 32) findings in a study of decision making about real-

world tasks, albeit in an experimental setting, “that under ignorance, when people should

 probably think harder when making decisions, they do not. In fact, they may be swayed by


the availability of simple arguments that serve to resolve the conflicts of choice.” In other 

words, once some threshold is reached, whether it be a threshold of certainty, impatience,

sufficient evidence, or something else, the decision-making process may terminate, even

for experts, whether the decision is well-supported or not. The concept of a decision

threshold is very important and it receives additional attention in connection with the

discussion of Geva et al.’s (2000) model.

A separate but related point is that when we are asked for our political preferences,

it is not likely that we always consult our memory to find pre-existing preferences.

Instead, it is much more likely that in many cases we construct our preferences on the spot

(Boynton, 1995; Feldman, 1995; Fischhoff, 1991; Lodge, 1995; Sears, 1993; Slovic,

1991). The political theories described in the next section attempt to explain, among other 

things, the origin of political preferences and how poorly informed voters make complex

decisions that are consistent with the decisions they might make given better information

and more time for deliberation.

Theories of Political Decision Making and Preconscious Processes

There are several theories in the political science and political psychology

literature that are consistent with or contributed to this study’s hypotheses about the

influence of preconscious processes on decision making. These are descriptive, not

normative or formal, theories of political choice or reasoning, so they do not make claims

or rely on assumptions that are at odds with what people do or can do when faced with

 political decisions. Also, they constitute the political science response to certain of the

theories and findings discussed thus far in support of the hypothesis that political decision

making is not a purely conscious process. The theories reviewed in this subsection offer 


several explanations of how people actually make political decisions, and they provide

insight into how people make political decisions in the absence of sufficient and sound

 policy-specific information. As such, these theories informed the data analysis for each of 

the research questions. It is worth noting, however, that these theories do not appear to be

derived from research on the decision making of legislators and policy experts, which is a

significant gap in the existing political science literature on decision making.

Affective Intelligence

Marcus et al. (2000, p. 1) advance a theory called Affective Intelligence, which is

about “how emotion and reason interact to produce a thoughtful and attentive citizenry.”

Marcus et al. are particularly interested in how we attend and respond to political matters,

given that few of us are professionally involved in politics and there are so many other 

more pressing demands on our attention and our time. “Most of the time, most of us

literally do not think about our political options but instead rely on our political habits.

Reliance on habit is deeply ingrained in our evolution to humanity. So when do we think 

about politics? When our emotions tell us to” (Marcus et al., 2000, p. 1).

Affective Intelligence is a dual-process theory composed of two emotional

subsystems: the disposition system and the surveillance system. Both systems in this

theory are subconscious and emotional, which is one way to distinguish this theory from

cognitive-experiential self-theory, in which one system is not subconscious or emotional

(the incorporation of conscious processes is one reason to consider Epstein’s theory the

most complete and useful of the various dual-process theories). The disposition system is

a “comparing system” that monitors three sources of information: somatosensory

information about the body, sensory information about the environment, and information


about our plans to determine whether an ongoing sequence of action, or plan, is

succeeding or failing (Marcus et al., 2000, p. 47). “As it continuously performs these

comparisons [about the success or failure of our plan], the disposition system influences

emotional outputs, in this case the degree of enthusiasm that in turn is related to the

conscious mood of enthusiasm, attention to task and behavior–the completion of the

ongoing plan” (Marcus et al., 2000, p. 47).

In other words, the emotions of the disposition system provide an ongoing

evaluation of “effort, the prospects of success, the current stock of physical and psychic

resources, and . . . the success and failure of the sequence of actions” we initiate, and this

ongoing evaluation, according to Marcus et al. (2000, p. 9), is what makes strategic action

 possible. The disposition system provides the executive functions that direct habitual

thought and behavior. Or, more specifically, “the disposition system relies on emotional

assessment to control the execution of habits: we sustain those habits about which we feel

enthusiastic and we abandon those that cause us despair” (Marcus et al., 2000, p. 10). By

contrast, the surveillance system is not concerned with enthusiasm but with anxiety as it

“monitors the environment for novel and threatening stimuli. It serves to interrupt habitual

routine and engage thought” (Marcus et al., 2000, p. 53) when anxiety is felt. Whereas the

disposition system is dedicated to those “actions that are already in [the] repertoire of 

habits and learned behaviors,” the surveillance system serves to warn us by increasing

anxiety “when we cannot rely on past learning to handle what now confronts us and to

warn us that some things and some people are powerful and dangerous” (Marcus et al.,

2000, p. 10).


Marcus et al. (2000, pp. 63-64) summarize the implications of Affective Intelligence regarding political habit and reasoned consideration as follows:

- Unless anxious, people will rely on their political habits to make voting

decisions. Anxiety will undermine the propensity to rely on political habit.

- The absence of anxiety, however, does not automatically mean that reliance on

habits will favor the habitual candidate, party, or program. [There must also be

enthusiasm for the habitual choice.]

- What makes people anxious depends on the habits they have acquired. . . .

- When anxious about candidates, issues, or the times they live in, people will rely

far less on their political habits to guide contemporary choices, will be motivated

to learn, will pay far more attention to contemporary affairs, and will be far more

influenced in the choices they make by the careful consideration of alternative

outcomes. Anxious voters will, in most instances, act very much like the rational

voters as [sic] depicted by theories of public choice. However, when complacent, voters will in most instances look very much like the value protecting voters depicted by [sic] theory of symbolic politics.

Using the terms relied on thus far in this document, Marcus et al. predict and offer 

empirical support for the prediction that political decision making will be the result of 

 preconscious processes unless anxiety provokes reasoning. Marcus et al. (2000, p. 124)

found evidence that “people use emotions, particularly anxiety, to stimulate active

reconsideration of their political views.” In other words, we use schemata or automatic

 processes until we feel threatened, at which point we focus our attention and use


deliberate thinking to choose the alternative that seems most appropriate under the new

circumstances. If one had to make a case for what makes preconscious and conscious

 processes adaptive, to explain why conscious systems may have evolved, or to describe

how these processes interact, there may be no clearer way to do it.

Symbolic Politics

As Marcus et al. (2000) noted in their discussion of the implications of Affective

Intelligence, Sears’s theory of symbolic politics concerns the impact of long-standing

dispositions or habits on political decision making and behavior (Sears, 1993). In simple

terms the theory of symbolic politics predicts that long-standing dispositions “provide

stable affective responses to particular symbols” (Sears, 1993, p. 120) that have

considerably more influence over policy and candidate preferences than reasoning or cost-

 benefit analyses. Sears hypothesizes that people acquire these dispositions “through a

 process of classical conditioning, which occurs most crucially at a relatively early age”

(1993, p. 120). Epstein (1990) hypothesizes that the preconscious experiential system is

directed by emotional responses and feelings that accrue from a very young age in a

similar manner. The operation of symbolic processing is not conscious and is in response

to certain political symbols in the decision maker’s environment; the operation of and

reliance upon these stable dispositions provides cognitive consistency in the face of the

enormous diversity of political attitude objects and the enormous complexity of political

issues (Sears, 1993, pp. 120-22). Sears’s theory fits well with the other theories and data

discussed in this chapter, and it provides yet another account of how we operate with a

limited cognitive system in a complex, uncertain, and always changing environment.


Heuristic and Online Models of Political Decision Making

The theories of affective intelligence and of symbolic politics provide models for 

thinking about the interaction between conscious and affective processes in matters of 

 political preferences, decisions, and voting behavior. These theories also offer 

explanations of how voters make apparently reasonable choices without sufficient

decision-specific information. Both affective intelligence and symbolic politics predict

that people make political decisions on the basis of habits and predispositions they have

formed over time, which is the same as saying that preconscious processes influence

 political decisions. Therefore, without reviewing current information about a particular 

candidate or policy question voters are still likely to make decisions consistent with their 

goals because they vote for the same party or candidate and respond in the same way to

certain important political issues. There are also other theoretical approaches to the

question of how people make decisions with little or no candidate- or policy-specific

information available in memory or acquired through goal-directed search. This section

describes two of these approaches, referred to broadly as (a) political cue or heuristic

models and (b) online or impression-driven models of decision making.

Political cue or political heuristic models account for decision making with little

information by hypothesizing that people can make reasonable decisions by relying on

useful cues instead of conducting an independent and comprehensive information search

and evaluation of each candidate or issue (Bartels, 1996; Lau & Redlawsk, 2001; Lupia,

1994; Sniderman, Brody, & Tetlock, 1991). The most common cues or heuristics used by

voters are party affiliation, candidate’s ideology, endorsements of candidates or issues by

third persons or entities the voter trusts, poll results and candidate appearance (Lau &


Redlawsk, 2001). For example, instead of researching each candidate in-depth over the

course of a campaign before primaries and general elections, voters can just vote for the

candidate selected by their preferred political party, recommended by the voter’s friends,

or leading in the most recent polls. There is evidence that participants in the present study

relied on such cues in making policy decisions.

Whereas cue or heuristic models of political judgment posit that political decisions

are made with little candidate- or policy-specific information, online models propose that

we base our evaluations on more information than we can recall when making the

decision. In other words, instead of being able to recall precisely the information that led

to a political decision we can only remember the overall evaluative impression or 

 judgment, referred to herein as the “overall tally” or “overall evaluative tally,” that results

from the exposure to consciously evaluated information over time. This online model

(Lavine, 2002; Lodge, 1995; Rahn, 1995) stands in contrast to memory-based models like

the traditional model. Memory-based models suggest that the abstract information recalled

from memory or collected through research causes political judgment. This memory-based

approach finds support in the strong correlation between memory and judgment (Lodge,

1995).

However, Lodge argues that memory-based models are flawed since, as discussed

in a previous section, voters have very little political information or knowledge, or at least

they recall very little information when asked to explain a political choice. Thus, what

voters can remember provides a weak explanation for their decisions (Lodge, 1995; Lodge

& Stroh, 1993). In response, Lodge (1995, p. 113) offers the online, impression-based

model of candidate evaluation, which is based on the hypothesis that “the information


from which conclusions are drawn may be forgotten, while the conclusions are still

retained.”

Under this model, even though a voter cannot recall the specific information or 

evidence that led to a preference for a candidate (or, for our purposes, a decision about a

 political issue), the specific information and evidence were incorporated into the voter’s

overall evaluative tally concerning the candidate or issue at the moment of exposure to the

information. When we think about a person or policy, we remember our overall tally or 

global assessment, not the specific reasons that produced the assessment, which is why,

according to this model, voters seem to lack relevant information but still manage to make

reasonable decisions. “At best, the citizen’s recollection will represent a biased sampling of the actual causal determinants of the candidate evaluation. At worst, the correlation

 between recall and judgment is spurious” (Lodge, 1995, p. 114). This latter point is

consistent with Nisbett and Wilson’s (1977) findings.

Lodge notes that the online model is likely to operate when one’s task is

impression formation, and not when the task requires the recall of specific information, so

extant online models may not apply to policy decisions about complex questions.

Notwithstanding this possible limitation, the idea of an overall tally that we can recall, built from once-considered information that we cannot, may prove very useful in explaining

how people make quick decisions about policy questions they have been exposed to in

some form in the past. The overall tally cannot fully explain decisions of first impression,

since the decision maker has not, by definition, been exposed to policy-specific

information. In a way, the online model of political decision making both supports and

challenges the central hypothesis of the present study. The online model suggests that


even if we cannot recall specific evidence in support of a decision, our global assessment

of a candidate or a policy is the result of earlier conscious processing of relevant

information about the candidate or policy.

By contrast, this study is based on the hypothesis that political decisions can be

made without any decision-specific information or conscious information processing. A

second important difference between the present study and Lodge’s (1995) approach

concerns research design. At least for the less familiar topic, this study likely did not

involve impression formation following from exposure to decision-specific information

over time. In other words, if one decision topic is novel then participants will not have

already formed or stored an impression about the issue since they will not have been

exposed to information about that issue.

 Nevertheless, because affect is central to the online approach (Lodge, 1995), and

the overall evaluative assessment that results from information processing is stored as an

“affective tag” (Marcus et al., 2000, p. 26) rather than as a memory of abstract

information, the online model fits well with the hypotheses that political decisions might

 precede reasoning and that reasons offered to support a decision will appear inadequate.

Another feature of the online model that fits well with the theoretical foundations of the

 present study is the recognition that the human cognitive system has certain important

limits on what, how quickly, and how much it can process (Lodge & Stroh, 1993).

The most important contribution of the online model (as described by Lodge,

1995) is that it raises a question that cannot be answered by the theories and studies cited

in this chapter: Are policy decisions the product of (a) explicit information we processed

once through conscious processes but for which we now have only an overall affective


evaluation or “tag” (Lodge, 1995), (b) schemata or procedures that are based on a once-

thoughtful and deliberate consideration of explicit information but that are now

 preconscious (Bargh & Chartrand, 1999), (c) preconscious feelings, somatic markers,

 political intuitions, long-standing dispositions or other affective generalizations, drawn

 from emotionally significant experiences, that direct thinking with general principles or 

theories about the world rather than decision-specific abstract information (Damasio,

1994; Epstein, 1994; Haidt, 2001; Sears, 1993), or (d) automatic responses to stimuli

objects that are rationalized by subsequent reasoning, whether or not the responses are

 based on explicit information or an affective generalization based on our experiences

(Zajonc, 1980)? The present study collected evidence to address this question, since the

soundness of the central hypothesis depends upon evidence that explanations (c) and (d)

are tenable.

Before concluding this section it is worth mentioning Geva, Mayhar, & Skorick’s

(2000) implicit theory of international relations, which is a particularly well-developed

model of political decision making that expands upon existing online models by adding

several useful concepts. While Lodge’s (1995) model addresses candidate evaluation,

Geva et al.’s model aims at decision making. To the online model described by Lodge,

which proposes that (a) people process the political information they receive sequentially,

(b) this information contributes to an overall and continuously evolving impression about

a person or issue, and (c) there may be a discounting of later information or at least

anchoring and adjustment from the reference point of the overall evaluation already

established, Geva et al. add the concepts of (d) decision threshold and (e) intercept.


When a decision threshold is reached, for example when a decision feels right, decision makers stop thinking about the decision to be made because they have made a

decision. In Haidt’s (2001, p. 829) terms, “We use conscious reflection to mull over a

 problem until one side feels right. Then we stop.” “Similarly, we tend to think of decision-

making as positive. Yet the act of decision, which we often describe as an ‘act’ of free

will, is more of a [negative act] by nature, because what seems consciously to be the

moment of ‘making’ the decision is actually the moment of terminating the process of 

considering alternatives” (Minsky, 1997, p. 520). Conscious reflection likely ranges from

reasoning that serves only to terminate additional investments of decision making time

and resources so that a decision can be made to reasoning that initiates and sustains

information search and analysis to maximize utility, with closure of the process occurring

only once certain criteria for quality are met. Intercept refers to the point where the

decision maker begins the decision-making process, in terms of the individual’s existing

knowledge, values, and goals, for example. The concept of an intercept point where you

enter the decision space is very useful as a means to represent what we bring to a decision

task, given the evidence that decision making is a contingent, constructivist process

(Lodge, 1995).

Critique of Research on Preconscious Influences

The literature presented in this chapter is one-sided in that all of the studies and

theories in this chapter point toward the influence of preconscious processes on conscious

ones, and suggest that decision making about complex questions is not an entirely

conscious process. The reviewed literature does not represent all sides of the debate on

decision making and reasoning, or on whether preconscious processes invariably precede


conscious processes when people evaluate objects in the environment. As shown in the

traditional model, there are other widely accepted ways of representing and thinking about

cognition and consciousness.

It would appear that the reviewed literature points in a similar direction because

the possible influence of preconscious processes on decision making and reasoning about

complex questions has been neglected, at least in the domains of economics and political

science. Whether any or every decision is made automatically or preconsciously is an

open question. Because this question was not being asked in connection with political

decisions, this study was designed to test the possibility of preconscious influence.

However, nothing in this document should be interpreted as a representation that the

current state of knowledge on human cognition is that decision making is the product of 

 preconscious or automatic processes.

Intuitive Decision Making and Reasoning (IDMR) Model

The Intuitive Decision Making and Reasoning model in Figure 2 is a device to

summarize the literature in the prior sections on preconscious processes and political

decision making in the form of a diagram. This visual representation of the IDMR model

serves three purposes: it is an efficient way to bring together the theories and findings in

this chapter and their implications for how the decision-making process could be

conceptualized; the diagrams of the traditional model and the IDMR model help make

clear how the models differ; and creating these visual representations made it possible to

ask participants, as part of the interview, to consider both diagrams and decide which

model more accurately described how most people and how they themselves made

 political decisions. In other words, the IDMR model serves as a counterpoint to the


traditional model. A discussion of how the IDMR model represents the findings presented

in this chapter follows.

The IDMR model adds preconscious processes to the traditional model discussed

at the beginning of this chapter in the form of an intuitive decision that precedes conscious

reasoning. The traditional model consisted of a decision task, the conscious reasoning

 process and the reasoned decision. The IDMR model consists of the decision task, an

intuitive decision, the conscious reasoning process, and the reasoned decision, with the

intuitive decision representing the outcome of preconscious processes. At the outset, the

IDMR model assumes that decision tasks vary in terms of their complexity, the time and

resources available to the decision maker (including the availability of additional relevant

and useful information), and how the task intersects with the decision maker’s

characteristics (e.g., task- or domain-specific knowledge, motivation or interest, habits or 

dispositions, and the importance or consequences of the task for the decision maker).

In its present form, the IDMR model posits that after the decision maker is

exposed to the decision task, an intuitive decision is generated, and that decision (which

may present itself as a feeling, instinct, or preference) precedes conscious reasoning about

the decision task. The idea of the preconscious decision and its place in the process are

 based on the theories and research on affect primacy, affect independence, the automatic

evaluation effect, the social intuition model, cognitive-experiential self-theory, and the

somatic marker hypothesis. This representation does not exclude the possibility that

 preconscious processes and conscious reasoning occur concurrently and interactively.

So, the IDMR model could have been drawn to show the intuitive decision and the

conscious reasoning process aligned vertically and operating in parallel, with arrows


consequences, and their expected utilities. Thus, the nature of the conscious reasoning

 process will vary by the decision task, the decision maker’s characteristics, and the

operation of preconscious processes, ranging in terms of effort from a quick search for 

 plausible reasons to support and explain the intuitive decision already made to an

intentional, effortful, and deliberate search for additional information to be used in a

reason-based, expected utility, cost-benefit or probability analysis prior to reaching a

reasoned decision. Finally, conscious reasoning may be an iterative process (correcting or 

adjusting the initial reasoned decision with additional reasoning, reaching a second

reasoned decision, and so on), again depending upon the decision task and decision

maker’s characteristics, and this possibility is depicted by the dotted arrow from the

reasoned decision back to the conscious reasoning process.

As part of the interview procedure, participants were shown the two model

diagrams in their present form (Figures 1 and 2). This made it possible for participants to

consider and compare the essential differences between the two approaches to decision

making, and to consider possible shortcomings in the traditional model.

Knowledge, Experience, and Expertise

While terms like decision making and reasoning are often used in this dissertation

to describe a broad range of processes as though individual or contextual differences were

not present or were not relevant, there is evidence of significant differences in how people

reason about and decide identical questions or issues (Kuhn, 1991; Stanovich & West,

2000). “[A]nthropology’s great truth is that we underestimate how and by how much

others see the world differently than we do” (Fischhoff, 1991, p. 637). Decision making is

not something everyone does in the same way.


At the same time, there are also important differences in the decisions we make.

Decisions about logic games are different than decisions about current policy issues.

Decisions about which heart surgeon, microwave popcorn, chocolate, or political

candidate to select differ in terms of their difficulty, complexity, and significance in

emotional, social, political, or economic terms. “Behavioral studies of decision making

indicate that people use different kinds of strategies for making different kinds of 

decisions” (Fischer & Johnson, 1986, p. 59).

Yet the first research question and the related analyses of data were not designed

to explore evidence of individual differences in decision making. The first research

question concerns whether there was evidence of the influence of preconscious processes

on participants’ decision making about complex policy questions, but does not concern

how participants’ decision making differed, or why their decisions or reasoning might

differ. This is why it became necessary to add the second and third research questions: to

explore the nature of the differences within and between participants and possible sources

of these differences. Accordingly, the study was designed to ask two decision questions

(one about a topic that should be familiar and one about a topic that should be less

familiar) of two groups of participants. This design aimed to produce evidence about

differences in how participants responded to the more and less familiar decision questions

and in how legislators and graduate students responded to the questions.

The two-question design was based on the author’s predictions and evidence in the

expertise literature that participants would decide and reason differently if they had

different levels of prior knowledge and experience on the decision topics (Sternberg,

1997). “A commonsense notion about expertise is that experts differ from novices due to


the number of experiences they have had within a particular domain” (Seifert, Patalano,

Hammond, & Converse, 1997, p. 101; Feltovich, Spiro, & Coulson, 1997). Accordingly,

this section cites literature relevant to the second and third research questions to support

the prediction that participants’ decision making on the two topics might differ and that

legislators’ decision making might differ from graduate students’.

For instance, the literature suggests possible differences in how each participant

represents the problem to be answered or the decision to be made for each policy question,

and how legislators and graduate students might compare in the way they represent the

 problems in their decision-making process (Voss, Lawrence, & Engle, 1991; Voss & Post,

1988). Similarly, it is important to consider how participants’ response times and certainty

in their decisions related to the content and quality of the evidence and justifications they

offered in support of their decisions, as well as how legislators and graduate students

compared in how they explained their own decision-making processes (see Bereiter &

Scardamalia (1993) for a general discussion of knowledge and expertise).

While Kuhn’s (1991) work revealed that experts in her study often did not use

more sophisticated reasoning than non-experts, she found that all members of one group

of experts, five graduate students in philosophy, reasoned at the highest levels for all three

social phenomena she investigated. On the question of how experts differ from

experienced non-experts (Bereiter & Scardamalia, 1993; Shanteau, 1992), if either 

legislators or graduate students can be characterized as experts in educational policy,

Kuhn’s study is of limited use because she did not compare experts and experienced non-

experts and because the individuals she labeled as experts would not satisfy the more strict


criteria for expertise applied by other authors (Alexander, 1997; Chase & Simon, 1973;

Chi, Feltovich, & Glaser, 1981).

Homel and Lawrence (1992) did not compare different groups of decision makers,

 but their study of two sets of magistrates’ sentencing orientations is relevant to the present

study since they found evidence that magistrates’ decisions were influenced both by their 

own beliefs and orientations as well as by court context, which can be interpreted as

evidence that there is reason to expect important differences in how legislators and

graduate students make policy decisions. The authors found evidence that “confirmed

 beyond reasonable doubt the substantial contributions of both court context and individual

sentencing style to the determination of penalties” in drunk driving cases (Homel &

Lawrence, 1992, pp. 530-531). Specifically with regard to differences in individual

sentencing style (sentencing is a form of decision making about complex questions), they

found evidence that magistrates relied on idiosyncratic schema when interpreting and

applying data relevant to their decisions (Homel & Lawrence, 1992).

The literature on expertise discusses many dimensions along which legislators and

graduate students might differ, whether or not either group can be characterized as expert.

For example, participants might have varying levels of decision-specific information (also

referred to as “domain knowledge,” Ackerman & Beier, 2003), whether from formal

education or from professional experience in a relevant field. Grigorenko (2003, p. 157)

links expertise to “the relevant knowledge base” and the “amount of training needed for 

the construction of the knowledge base.” Again, based on the expertise literature,

 participants can be expected to differ in terms of relevant information and experience.

This is why it seemed appropriate to ask two decision questions: to determine whether 


differences in participants’ decision-specific information influenced their decisions and

decision-making processes.

Along the same lines, some participants in the present study might be superior in

“the relevant abilities or necessary resources” associated with making complex decisions

of educational policy, some might benefit from “long-term expertise development” in

educational matters, and some might have, through practice and experience, “acquired

mechanisms that permit them to circumvent the specific limitations in general processing

resources in those tasks or activities relevant to their domains” (Krampe & Baltes, 2003,

 p. 51). Thus, the expertise literature offers many dimensions along which people can

differ when reasoning and deciding.

At the same time, greater domain knowledge and relevant experience may not

necessarily result in more sophisticated reasoning or improved decision making

(Ericsson, 2003). “[E]xperts in many domains, such as investing, auditing, and clinical

therapy, have not been found to perform at a level superior to other experienced

individuals on representative tasks in their domains” (Ericsson, 2003, p. 105). Johnson

(1988, p. 211) makes clear that while experts in some domains outperform novices,

“[r]esearch in decision and judgment provides a marked contrast. . . . The results in this

literature present a rather pessimistic appraisal of experts.” In the behavioral decision

literature, compared to novices and linear models, “[t]he superiority of experts to novices

is often surprisingly small, or, in some cases, nonexistent; more disturbing may be the

superiority of trivial linear representations to the performance of carefully trained human

 judges” (Johnson, 1988, p. 212).


Kuhn’s (1991) findings were consistent with Ericsson’s and Johnson’s

observations, except that the graduate students in her study did outperform the other 

 participants, both novice and expert. Part of Ericsson’s point in citing numerous studies in

which experts did not outperform other experienced individuals is to emphasize that “the

scientific study of expert and exceptional performance must be restricted to individuals

with reliably superior performance characteristics” (Ericsson, 2003, p. 105) (emphasis

supplied). While the present study does not investigate legislators’ or graduate students’

 performance from the standpoint that they are experts in education or in decision making,

or with a view toward characterizing them as experts, Ericsson’s admonition is a reminder 

that simply because someone performs tasks regularly or occupies a position that

 predisposes others to classify them as experts, judgment of their expertise must await

evidence of reliably superior performance.

Finally, in terms of where the present study fits in the evolution of theories of 

expertise, using Holyoak’s (1991) scheme the present study is in the third generation. This

is so because, unlike first-generation theories this study is based on the hypothesis that

expertise in political decision making depends upon considerable domain-specific

knowledge. And unlike second-generation theories which, “with their emphasis on the

acquisition of more specialized production rules through knowledge compilation, can be

characterized as attempts to explain routine expertise” (Holyoak, 1991, p. 311), this study

is based on the assumption that adaptive expertise is an essential element of what can

properly be labeled expertise. The present study evaluated participants’ “capacity to handle

novel situations, to reconsider and explain the validity of rules, and to reason about the

[relevant] domain from first principles” (Wenger, 1987, p. 302). Finally, the present study


falls into the third generation of theories of expertise because it seeks to examine and

 provide an account for “the most striking aspect of human expert performance: Experts

tend to arrive quickly at a small number (sometimes one) of the best solutions to a

 problem, without serial search through alternative possibilities” (Holyoak, 1991, pp. 313-

314). This is another way of saying that the traditional model of reasoning and decision

making does not accurately describe how experts make decisions about complex policy

questions.


CHAPTER III

METHODOLOGY

The previous chapter outlined the literature that inspired the research questions and

informed the design of the present study. This chapter provides a more detailed discussion

of that design and the data collected.

Two principal objectives shaped the design and methods of this study. The first

was to seek evidence of preconscious influences on decision making about complex

 policy questions and the second was to include legislators in the study sample. The first

objective has been discussed in depth in Chapters I and II. The second objective, to

include legislators, was the result of revisions to earlier designs that proved inadequate

 because policy decisions may not be meaningful and important for college students, for 

example. After considering college undergraduates, faculty members, legislative

committee staff, doctors, lawyers, and other sample groups, it became apparent that

legislators would be the ideal participants in a study of decision making about complex

 policy questions because elected officials are individuals who make such decisions on a

regular basis, adding to the meaningfulness of findings.

The decision to include legislators shaped the interview protocol, interview

 procedures and settings, the ways in which data were collected and what could be

measured. The legislative participants were essential to this study, but at the same time

these participants limited what could be done to collect evidence of

 preconscious processes. Specifically, the interview could not be too long because only a

reasonable amount of time could be requested from legislators. The interview questions

could not seem intrusive, redundant, or otherwise inappropriate because there might be


significant consequences. Legislators had to be interviewed in their offices or in some

other location of their choosing as a courtesy to them for agreeing to participate and

 because of their time constraints. The interview could be tape recorded, but legislators

could not be connected to any electronic apparatus to measure vital signs or skin response

or subjected to brain imaging scans in a hospital or other medical imaging facility.

Similarly, it would be unseemly and distracting to ask legislators to press a button every

time they made a decision. In sum, this study was designed to be as unobtrusive and

 professional as possible, so that legislators would agree to participate and would complete

the interview with a positive impression of the process.

As a specific example of how participant choice shaped study design, consider the

variables “decision latency” and “analysis time.” In addition to the self-report data

collected by interview questions, it was necessary to collect some objective, visible data of 

 preconscious influences on decision making. One way to do this was to measure how

quickly participants made decisions (decision latency) and offered reasons to support their 

decisions (analysis time), because how quickly decisions were made and reasons were

offered, and how decision latency compared to analysis time, might provide important

evidence that decision making about complex questions is not an entirely conscious

 process, when these response time data were analyzed in connection with the nature and

quality of participants’ evidence, their choice of decision model, and the other data

collected.

Decision latency and analysis time were measured using the interview recordings

and a stopwatch, months after the interviews were completed. A more reliable way to

measure response times in this sort of cognitive task analysis would be to measure them


mechanically or electronically, by having participants press a button when they made a

decision, for example. Similarly, galvanic skin response could be measured with skin

sensors, or neurological responses could be measured using real-time brain imaging.

Unfortunately, none of these alternatives were suitable for a legislative sample, so it

 became necessary to do what was possible to collect evidence of preconscious processes.

As discussed further, this included (a) measuring response times with a stopwatch; (b)

asking questions to measure the nature and quality of participants’ information about the

 policy questions, their certainty in the accuracy of their decisions, and their affective

response to the policy questions; and, (c) asking them to think about their own decision

making processes using the model diagrams in Figures 1 and 2. In the end, a variety of 

measures were employed to triangulate whether or not preconscious processes were

operating on participants’ decisions. There were self-report measures of certainty, affect,

and decision making processes, as well as objective measures of response times and

sources of evidence, for example. Together, these measures were designed to elicit

evidence no single measure could produce.

Pilot Study

The purpose of the pilot study was to help select two decision questions for 

interviews with legislators and doctoral students, from the four decision questions

introduced in Chapter II. In particular, the goal was to select the most familiar and the

least familiar decision questions from the four alternatives: Would you oppose or support

legislation to enable [name of state] to provide computers for use in religious primary or 

secondary schools as a means to improve academic achievement?; Would you support or 

oppose legislation to limit class size to 25 students in all [name of state] public schools as


a means to improve academic achievement?; Would you support or oppose legislation to

transfer management and control of public schools in your county or legislative district

from the local school board to a private company as a means to improve academic

achievement?; and, Would you support or oppose legislation to change how public

schools are financed in [name of state] so that the existing system, in which local property

tax assessments provide a major source of funding, would be replaced by a statewide

increase in the sales tax, as a means to reduce the disparities in financial resources among

the various counties?

A second objective of the pilot study was to evaluate the revised interview

 protocol that was based on Kuhn’s (1991) work. Because the interview protocol in

Appendix A was prepared for this study, it had not been evaluated in terms of how clear 

the questions would be for the intended participants or how long the interview would take

to administer. Specifically, legislators were asked to allot one hour for their participation

in the study so it was necessary to make certain the interview would be completed in that

amount of time.

Participants

The pilot sample was composed of five adults, two doctors, two lawyers, and one

doctoral student. The pilot participants ranged in age from 32 to 34, with three males and

two females.

Materials

The pilot study interviews were conducted with the interview protocol in

Appendix A. This protocol is discussed in greater detail in connection with the final study.


 public school finance, but it evoked a stronger reaction from participants than the finance

question. Therefore, the question about privatization was selected for the final study.

Second, the pilot study confirmed that the interview could be completed within the

amount of time requested from legislators. No pilot participant took longer than 20

minutes to complete the entire interview (Parts 1 and 2). Finally, pilot participants were

able to understand and answer all of the interview questions, which suggested that the

questions would be appropriate for legislators and doctoral students.

Final Study

Participants

The study sample was composed of two groups of adults, with a total of 59

 participants. The first group consisted of 41 state legislators from two states in the eastern

United States, with 27 male and 14 female legislators, as one of the principal objectives of 

this study was to interview legislators about their decision making processes. Letters were

sent to all the state legislators in various counties in the two states, for a total of about 120

requests. The sample was composed of all the legislators from this group who agreed to

 participate in the study. Of these 41 legislators, 5 did not complete college, 16 completed

college without completing graduate level or professional education, 19 completed a

master’s degree or a law degree, and 1 legislator completed an LL.M., which is a one-year legal master’s degree following completion of law school. The mean age of legislators was

49.7 years ( N = 37, SD = 12.9), and the mean number of years as a legislator was 8.3 years

(SD = 7.3).

The second group in this study consisted of 18 doctoral students in a college of 

education at a large public university, all female but one. Participants from this group


were enlisted through electronic mail requests. One student had recently completed her 

Ph.D. in education, while the other students were working towards this degree. The mean

age of the graduate students was 32.7 years (N = 14, SD = 7.7).

Legislators and doctoral students were included in the sample to increase the

likelihood that the decision questions, which ask about educational policy, were

interesting and meaningful for study participants. Although almost all adults in the United

States are likely to encounter and are entitled to make political decisions that shape public

 policy, not all political decisions are meaningful for or available to all adults given the

wide range of issues. In a representative democracy like the United States citizens

generally only vote on candidates, while elected officials make decisions on specific

 policy issues. Referenda are an exception to this general rule since they enable voters to

vote directly on specific policy questions, but in any election voters face only a small

number of referenda, if there are any. Further, not all adults are registered to vote and not

all registered adults vote.

Since all adults are not well-informed about political issues (Bartels, 1996; Lau &

Redlawsk, 2001) and many political questions are likely not relevant or meaningful for all

adults, a sample drawn from the general adult population would not have been appropriate

for this study of how policy decisions are made, simply because not all adults make

 political decisions and few adults make specific policy decisions on a regular basis. To

ensure that the policy questions studied were important to study participants, state

legislators and doctoral students in a college of education were selected for the study

sample.


Legislators are part of the sample for another reason. They are political decision

makers who face and make decisions on thousands of policy issues annually. Educational

 policy is also very important to voters and interest groups at the state level, so these

decisions have political consequences for legislators. As a result, it was assumed they

would have the knowledge, motivation, and skills necessary to answer questions about

educational policies and to treat the decision questions in the study as they would treat the

same questions in the legislature. There are at least two additional reasons for 

interviewing legislators about their decision making, although these reasons are less

relevant to the research questions in this study. First, state legislators have considerable

influence over public education. It can be argued that state legislators have greater 

influence over public education than any other group, so it is important to study how they

make educational policy decisions and what they think about certain policy issues.

Second, there does not appear to be any study that interviewed legislators about their 

decision-making processes. Thus, the literature on political decision making, and decision

making more generally, would be enhanced by the direct study of this relevant and

important population.

Doctoral students in education are among the participants because their inclusion

allowed an initial investigation into how prior knowledge and experience in matters of 

 public policy generally and educational policy in particular bear upon decision-making

and reasoning processes (Kuhn, 1991). Few sample groups are likely to have comparable decision-specific information on educational issues. So while legislators have experience

with the political process, they are not necessarily well-informed about educational

matters given the diversity of issues they face each legislative session. In other words,


legislators must be generalists in matters of policy, except in those areas where committee

membership or personal experience informs them on specific issues. Doctoral students in

education, by comparison, have made a professional commitment to study educational

issues, and it stands to reason that they would have considerable background knowledge

of educational policy questions.

In sum, comparing the decisions and interview responses of the two groups could

 provide evidence of how knowledge shapes the decision-making process about complex

questions, and the ways in which the decision-making process varies within individuals,

 between individuals, and between groups. Comparing legislators and doctoral students

made it possible to go beyond the first research question about preconscious influences

and to investigate the role of knowledge and experience in decision making.

Materials

Decision Questions

All legislators and doctoral students were asked to decide whether they would

support or oppose a legislative proposal to limit class size in all public schools to 25

students and a legislative proposal to transfer control of public schools to a private

company. These two decision questions were selected following the pilot study. The

 precise language of both decision questions follows:

1. Would you support or oppose legislation to limit class size to 25 students in all

[name of state] public schools as a means to improve academic achievement?

2. Would you support or oppose legislation to transfer management and control of 

 public schools in your county or legislative district from the local school board to a

 private company as a means to improve academic achievement?


Interview Protocol

The interview protocol used in this study (Appendix A) was based on the protocol

developed by Kuhn (1991; Appendix B). The protocol used here was derived from Kuhn’s

 protocol because her study is the only one that has investigated adults’ theories and

reasoning about complex social problems with an in-depth interview, which made her 

work a model for the present study. While the interview protocol was based on Kuhn’s,

changes were necessary to make the protocol more suitable to the specific research

questions and participants in this investigation. These changes were guided by the

research questions, conversations with dissertation committee members, and

correspondence with a researcher who had considerable experience studying informal

reasoning.

For example, after reading Kuhn’s (1991) protocol to several adults and receiving

feedback on the number and tone of the questions, it did not seem appropriate to ask 

legislators Kuhn’s 24 questions about the first decision question, repeat the process for the

second decision question, and then ask about their choice of decision model. Additionally,

this study concentrated on preconscious influences on participants’ decisions and none of 

Kuhn’s questions were designed to investigate these influences.

Ultimately, some of Kuhn’s questions were retained and others added to test the

central hypothesis of this study about the influence of preconscious processes on decision

making. The eight questions and related probes in Part 1 of the interview protocol were

drafted to collect data on participants’ response times, evidence, counterarguments,

certainty, epistemological understanding, self-assessed knowledge, affective response, and

reported speed to decision. Each of the variables is discussed in the variables section.


The language of the questions drawn from Kuhn’s protocol was also revised

 because the original questions were drafted to elicit participants’ causal theories, not

decisions about specific policies, and because it was important that participants in the

 present study feel less like subjects in an experiment and more like participants in a

conversation about educational policy. So, for example, Kuhn’s (1991, p. 299) first

question on the issue of recidivism asked, “What causes prisoners to return to crime after

they’re released?” Since this study did not investigate causal theories, this question was

not appropriate for the present study, even if the topic of the question was changed to ask,

for instance, “What causes people to propose that class size be limited to 25 students?”

Instead, the decision question in this study asked whether participants would support or 

oppose proposed legislation to limit class size and the first interview question was, “Why

would you [support/oppose] such legislation?”

Variables

This section describes the variables measured in this study. Table 1 summarizes

these variables with brief descriptions, coding details, data analyses conducted and

whether the variable was based on Kuhn’s (1991) work. Some of the variables

(justification, decision latency, analysis time, counterargument latency, partisan latency,

and reported speed to decision) measure participants’ response time or perception of 

response time. Other variables measure the content and quality of the information

 participants offer in connection with their decisions. These variables (i.e., citing evidence,

 justificatory rationale, counterarguments, expert knowledge, and argument repertoire)

index how and how well respondents explain and support their public policy decisions to

help determine, among other things, the extent to which participants’ decisions are


 products of their conscious reasoning or their conscious reasoning is a product of their 

decisions. The remaining variables measure participants’ certainty in their decision

(certainty), affective response to the decision question (affect), rating of how much they

know about the decision topic (self-assessed knowledge), and choice of decision making

model. The development of the coding schemes for these variables is discussed in

Appendix C.
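Although the coding schemes are detailed in Table 1 and Appendix C, a brief illustrative sketch may help make the ordinal codes concrete. The Python fragment below is a hypothetical rendering of the self-report codes (certainty, reported speed to decision, self-assessed knowledge, and affect); the names and functions are illustrative assumptions used only for exposition and were not part of the study materials.

    # Hypothetical rendering of the self-report coding schemes summarized in
    # Table 1; names and data structures are for illustration only.

    CERTAINTY_CODES = {"not certain": 0, "somewhat uncertain": 1,
                       "somewhat certain": 2, "certain": 3}

    SPEED_CODES = {"slowly": 0, "deliberately": 1,
                   "quickly": 2, "instantaneously": 3}

    def code_self_assessed_knowledge(thought_about_before: bool, rating: int) -> int:
        # Coded 0 if the participant had never thought about or discussed the
        # topic; otherwise the participant's own 0-4 rating was used.
        return rating if thought_about_before else 0

    def code_affect(reported_feelings_ideas_or_images: bool) -> str:
        # Affect was coded simply as "yes" or "no".
        return "yes" if reported_feelings_ideas_or_images else "no"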

Response Time: Decision Latency, Analysis Time, Counterargument Latency, and

Partisan Latency

“Decision latency” measured (in seconds) the amount of time from the end of the

decision question posed by the interviewer to the statement of the decision to support or 

oppose the proposed legislation by the participant (e.g., a decision was made when the

 participant said “oppose,” “support,” “yes,” or “no”), or a statement that made clear that

the participant had decided to support or oppose the legislation even though the words

“support” or “oppose” were offered subsequently. So, for example, Legislator 4 responded

to the decision question about whether he would support or oppose a proposal to limit

class size to 25 students as follows: “[ pause] you know [ pause] I would likely at the state

level oppose it . . .” Listening to the interview, it was decided that this legislator’s

deliberation ended before he began the phrase “I would likely at the state level oppose it.”

Therefore, decision latency was measured from the end of the interviewer’s decision

question to the beginning of that phrase. The stopwatch was started when the interviewer finished his question and stopped before the participant said “I.”
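To illustrate how these elapsed times could be reduced to the whole-second codes described in Table 1 (any measured time between 2.00 and 2.99 seconds was coded as 2 seconds), the following sketch uses invented event times rather than actual interview data; the timestamps and variable names are hypothetical.

    import math

    def whole_seconds(elapsed: float) -> int:
        # Truncate an elapsed time to the whole-second code used in Table 1.
        return math.floor(elapsed)

    # Hypothetical event times (seconds into a recording); not study data.
    end_of_decision_question = 12.4     # interviewer finishes the decision question
    start_of_decision_statement = 15.1  # participant begins stating the decision
    end_of_question_1 = 40.0            # interviewer finishes Question 1 ("Why ...?")
    start_of_first_reason = 47.8        # participant begins the first justification

    decision_latency = whole_seconds(start_of_decision_statement - end_of_decision_question)  # 2
    analysis_time = whole_seconds(start_of_first_reason - end_of_question_1)                  # 7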


Table 1

Descriptions of Variables, Coding Details, Data Analyses, and Identification of Variables Influenced by Kuhn (1991)

Decision Latency
Description: The amount of time (in seconds) that elapsed from the end of the decision question to the beginning of the participant’s statement of a decision.
Coding details: Time measured in whole seconds (e.g., decision latency was coded as 2 seconds for any measured elapsed time between 2.00 and 2.99 seconds).
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None.

Analysis Time
Description: The amount of time (in seconds) that elapsed from the end of Question 1 (of Part 1 of the interview protocol in Appendix A, unless otherwise specified) to the beginning of the participant’s statement of the first reason for the decision.
Coding details: Time measured in whole seconds (e.g., analysis time was coded as 2 seconds for any measured elapsed time between 2.00 and 2.99 seconds).
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None.

Counterargument Latency
Description: The amount of time (in seconds) that elapsed from the end of Question 2 to the beginning of the participant’s statement of a counterargument.
Coding details: Time measured in whole seconds (e.g., counterargument latency was coded as 2 seconds for any measured elapsed time between 2.00 and 2.99 seconds).
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None.

Partisan Latency
Description: The amount of time (in seconds) that elapsed from the end of Question 7 to the beginning of the participant’s statement of a decision about whether the proposed legislation was a liberal or conservative position.
Coding details: Time measured in whole seconds (e.g., partisan latency was coded as 2 seconds for any measured elapsed time between 2.00 and 2.99 seconds).
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None.

Justifications
Description: The number of justifications the participant offered in response to Question 1 and the follow-up probe; also referred to as “reasons.”
Coding details: Counted the number of discrete justifications the participant offered.
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None.

Citing Evidence (external evidence, personal evidence, nonevidence)
Description: Classified the source of the evidence the participant offered (Question 1 and the follow-up probe).
Coding details: None.
Data analysis: Frequency, percentage.
Kuhn’s (1991) original variable and subcategories: Evidence (genuine evidence, pseudoevidence, nonevidence).

Justificatory Rationale (controlling law, professional publication, general publication, data, professional experience, personal experience, vague)
Description: If the evidence offered in response to Question 1 and the follow-up probe was classified as external or personal, it was classified more narrowly into one of the categories of this variable.
Coding details: None.
Data analysis: Frequency, percentage.
Kuhn’s (1991) original variable and subcategories: Evidence (genuine evidence, pseudoevidence, nonevidence).

Counterarguments (specific, relevant, unsuccessful, nonattempt)
Description: Classified the counterarguments participants generated against their policy decision (Question 2).
Coding details: None.
Data analysis: Frequency, percentage.
Kuhn’s (1991) original variable and subcategories: Counterarguments (successful, alternative theory, unsuccessful, nonattempt).

Certainty (certain, somewhat certain, somewhat uncertain, not certain)
Description: Measured the participant’s certainty about their policy decision (Question 3).
Coding details: Coded as follows: 0 (not certain), 1 (somewhat uncertain), 2 (somewhat certain), 3 (certain).
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: Certainty (low, medium, high, very high).

Expert Knowledge (evaluative, multiplist, absolutist)
Description: Classified the participant’s view of expert knowledge (Question 4 and the follow-up probe).
Coding details: The coding scheme was identical to Kuhn’s (1991).
Data analysis: Not analyzed.
Kuhn’s (1991) original variable and subcategories: Epistemological understanding (evaluative, multiplist, absolutist).

Self-Assessed Knowledge
Description: Measured how much participants said they knew about the decision topic on a scale from 0 to 4, 4 being highest (Question 5).
Coding details: If the participant had not thought about or discussed the decision topic previously, self-assessed knowledge was coded as 0; otherwise, it was coded as the number the participant offered to rate their knowledge.
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: Knowledge compared to the average person (more, same, less).

Affect
Description: Measured whether the participant reported any feelings, ideas, or images in response to the proposed legislation (Question 6).
Coding details: Coded as “yes” or “no.”
Data analysis: Frequency, percentage.
Kuhn’s (1991) original variable and subcategories: None.

Reported Speed to Decision (instantaneously, quickly, deliberately, slowly)
Description: Measured how quickly the participant reported making their policy decision (Question 8).
Coding details: Coded as follows: 0 (slowly), 1 (deliberately), 2 (quickly), 3 (instantaneously).
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None.

Argument Repertoire
Description: Total number of justifications and counterarguments a participant generated in response to Questions 1 and 2.
Coding details: None.
Data analysis: Mean, standard deviation, paired-sample t-test, correlation.
Kuhn’s (1991) original variable and subcategories: None; this variable was drawn from Cappella, Price, and Nir (2002).

Choice of Decision Model (traditional, IDMR)
Description: Measured participants’ choice of decision model to describe how most people and how they themselves made political decisions (Questions 2 and 3 of Part 2 of the interview).
Coding details: Participants viewed the model diagrams in Figure 1 and Figure 2 before answering questions about the models.
Data analysis: Frequency, percentage.
Kuhn’s (1991) original variable and subcategories: None.


As with analysis time, “counterargument latency” and “partisan latency”

measured (in seconds) the amount of time from the end of the relevant interview question

to the beginning of the first phrase or sentence in which participant offered his or her 

response to the interview question. Because stating a decision (decision latency) with one

word, say “oppose,” inevitably takes less time than stating the reason or reasons (analysis

time) to explain or justify that decision, given that expressing a reason or reasons involves

saying more words than a decision to support or oppose, it was necessary to measure

analysis time, as well as counterargument latency and partisan latency, from the

 beginning of the sentence or phrase in which the participant’s first reason was expressed.

There is no reason to believe that converting thoughts to language takes longer for the

decision than for the reasons, but it was critical that decision latency and analysis time be

comparable. As explained earlier, the method for measuring response times was

imperfect, in large part because it was not possible to connect legislators to a measuring

device, but the method employed to measure response times was a valid way to compare

how long it took participants to make a decision and then to offer reasons for that

decision.

It was hypothesized that for one or both decision topics the decision would come

significantly more quickly than the reasons. If this pattern arose it would suggest that

reasoning followed decision making and would support the hypothesis that the decision

and the supporting reasons were products of separate cognitive processes, one

 preconscious and one deliberate, as proposed by Epstein (1990) and Zajonc (1980),

contrary to existing models of political decision making. If reasoning caused and

 preceded decisions in all cases, then reporting a decision should take longer than


reporting the reasons that led to that decision, since the time it would take to report a

decision would include the time it takes to generate reasons, evaluate reasons, and make

a decision. If, on the other hand, a decision was intuitive, the decision would take less

time than the conscious process of generating reasons to explain it.
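A minimal sketch of this planned comparison between decision latency and analysis time is given below, under the assumption of fabricated placeholder values; Table 1 lists paired-sample t-tests and correlations as the analyses for these response-time variables, and the scipy routines are used here only to illustrate that comparison, not to report study results.

    from scipy import stats

    # Fabricated placeholder values (in whole seconds), one pair per
    # participant for a single decision topic; these are not study data.
    decision_latency = [1, 2, 1, 3, 2, 1, 4, 2]
    analysis_time = [5, 7, 4, 9, 6, 3, 8, 7]

    # Paired-sample t-test: did reasons take longer to produce than decisions?
    t_stat, p_value = stats.ttest_rel(analysis_time, decision_latency)

    # Correlation between the two response times across participants.
    r, r_p = stats.pearsonr(decision_latency, analysis_time)

    print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}; r = {r:.2f}")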

Justifications

This variable measured the number of discrete justifications participants offered

in response to Question 1 and the follow-up probe to explain their policy decisions.

Justifications are also referred to as “reasons.” The justifications measured were very

similar to what Kuhn (1991) referred to as arguments in recommending that thinking

should not be conceptualized as problem solving but rather as argument. In other words,

“much of the thinking we do, certainly about issues that are important to us, involves

silently arguing with ourselves–formulating and weighing the arguments for and against

a course of action, a point of view, or a solution to a problem” (Kuhn, 1991, p. 2).

Participants’ justifications were the reasons or arguments they gave to explain their 

decisions. Often, participants repeated the same argument in different words, so the

challenge in coding justifications was to distinguish between a new argument and a

redundant one.

Once participants made their decision, unless they had independently offered their 

 justifications or reasons for their decision, they were asked why they would support or 

oppose (depending on their decision) the proposed legislation (Question 1). If they did

not offer specific grounds for their decision, they were then asked the follow-up probe. In

response to these two questions, participants explained their decisions with justifications

(or reasons). The evidence (see “citing evidence” and “justificatory rationale”) they


offered in support of the decision or their justification of the decision is discussed in

subsequent sections.

As an example of how “justification” was coded, Legislator 2 said the following

in response to the follow-up probe:

[M]y reading of literature which I have to admit is largely confined on this issue

to newspapers like the New York Times, the Washington Post, Wall Street

Journal, things like that Christian Science Monitor tends to lead me to believe that

lower class size creates an intimacy between the teacher and the student, creates a

 better learning environment, a significantly better learning environment and I

think there have been some studies that have correlated lower class size with

 better productivity on the standardized test and things like that. Could there be

other studies going the other way there always are, [ pause] but at this point my

sense of the data is that its significantly assists in the matriculation process and

think it would be a good idea.

There are three discrete justifications in this response. The first was that lower 

class size “creates an intimacy between the teacher and the student” (L2). The second

was that lower class size “creates a better learning environment,” and the third was that

“there have been some studies that have correlated lower class size with better 

 productivity on the standardized test[s]” (L2). The first and second justifications were

coded separately, even though they are almost redundant, because the second justification

could be interpreted as Legislator 2 saying that lower class size creates a better learning

environment for reasons other than creating an intimacy between the teacher and the

student. For example, the student could feel more comfortable around peers because of 


lower class size. Because Legislator 2 mentioned specific newspapers as the sources of 

his evidence, “citing evidence” was coded as “external evidence” and “justificatory

rationale” was coded as “general publication.”

Citing Evidence

This variable was designed to measure whether participants relied on external or 

 personal evidence in making or in supporting their decisions. Interview Question 1

(“Why would you [support/oppose] such legislation?”) and the follow-up probe (asking

whether participant’s decision was based on any specific studies, committee reports, or 

 personal experience) elicited participants’ evidence for their decisions. The evidence

 participants cited was coded into one of the following categories: external evidence,

 personal evidence, or nonevidence. The type of evidence offered in response to

Question 1 was measured separately from the type of evidence offered in response to the

 probe, since the follow-up probe prompted participants to offer specific types of 

evidence. For purposes of calculating “argument repertoire,” the total number of 

 justifications a participant offered in support of her decision is based on the number of 

reasons offered in response to both Question 1 and the follow-up probe.

“External evidence” is defined as evidence in support of a policy decision that

was relevant to the decision, could lead to or cause the decision, and was based on

something more than personal experience alone, for instance, citing as evidence an

empirical study, a published article, testimony in committee, committee reports, position

statements from interested parties, or course work on the decision topic. For instance, in

explaining why he would oppose the proposal to privatize public schools in his

legislative district, Legislator 12 said, “I have read lots of information regarding the most


effective way to control a local public school system.” This is external evidence. In terms

of justificatory rationale, this would be coded as “data” because he did not cite the

specific source of this external information.

“Personal evidence” is defined as a reason or as evidence in support of a policy

decision that was relevant to the decision, could lead to or cause the decision, and

was based on personal experience, values, principles, or beliefs without mention of an

external source of support or confirmation that would qualify under the definition of 

external evidence. Again using Legislator 12 as an example, in opposing privatization he

explained, “I am convinced that local control of the delivery of public education is an

essential component of the success of the local school system.” This argument is based

on “personal evidence” because it is a statement of the legislator’s belief without any

reference to an external source of support for the belief. Legislator 12 offered this

 personal evidence in response to Question 1 and then he offered the external evidence

cited in the prior paragraph after he was asked the follow-up probe for specific evidence

to support his decision.

Finally, “nonevidence” is defined as any answer offered by a respondent that did

not provide any coherent evidence or reasons to support the policy decision, implied that

evidence is unnecessary or irrelevant, or offered a response that did not qualify as

external evidence or personal evidence.


Justificatory Rationale

If a participant offered external or personal evidence in response to Question 1 or 

the follow-up probe, the variable “justificatory rationale” served to further classify the

sources of participant’s evidence. The evidence offered in response to Question 1 was

measured separately from the evidence offered in response to the follow-up probe,

 because legislators often offered different types of evidence in response to Question 1

and the subsequent probe. For example, as described in the discussion of “citing

evidence,” for the same decision Legislator 12 offered personal evidence in response to

Question 1 and external evidence in response to the subsequent probe. Justificatory

rationale consists of the following seven categories, or sources of support, used to classify the external and personal evidence participants offered in support of their decisions and the reasons for those decisions.

“Controlling law” encompasses a relevant law or regulation that governs the

decision. For example, Legislator 5 cited his state’s “constitutional mandate to fund an

adequate and equitable education” in opposing the proposal to privatize public schools.

“Professional publication” includes a peer-reviewed study, a report by legislative services

or committee staff, or a published article in an education-specific publication (e.g.,

 Education Week ). “General publication” refers to an article in a newspaper, magazine or 

other general publication or a position statement by an interested party. “Data”

encompasses statements by a participant that refer to an external source of support for his

or her decision without offering specific information about that source, so that the source

cannot be classified as controlling law, professional publication or general publication.

So, for instance, if a participant says that she has read studies in support of smaller class


sizes, but does not cite where she read about the studies, that source of support is

classified as “data.”

Controlling law, professional publication, general publication and data are

classified as external evidence. “Professional experience” covers a participant’s experience in public education as a teacher, school board member, member of an education committee, or education lobbyist; post-graduate coursework in education; or any other professional experience focusing on issues of public education. Professional experience

is coded as external evidence if a legislator reports that the decision to support smaller 

class sizes, for example, is based on testimony she heard in a committee hearing on the

issue, but is coded as personal evidence (along with the next category of personal

experience) if the legislator reported supporting smaller class sizes because when she

worked as a teacher it was easier to manage a smaller class. “Personal experience” refers

to support or explanations that a participant offers for his or her decision based on

experience that does not qualify as professional experience, including personal

 principles, personal values or beliefs, which include partisan ideological positions,

feelings, and heuristics (i.e., generally accepted beliefs, truisms, catch-phrases). The last

category of justificatory rationale covers “vague” sources of support or reasons,

including imprecise statements.

Counterarguments

Question 2 of the interview protocol (“Suppose now that one or more colleagues

disagreed with your decision regarding this legislation. What evidence might they give or 

what arguments might they make in [opposing/supporting] the legislation?”) elicited

 participants’ counterarguments. A counterargument would consist of evidence or 


arguments that a colleague who disagreed with the participant would offer in support of the opposing position on the proposed legislation. In other words, a counterargument is an argument or evidence offered to oppose the participant’s decision.

Counterarguments were coded into one of four categories: specific, relevant,

unsuccessful, or nonattempt. “Specific” counterarguments were those directed at whether 

or how participant’s or opponents’ policy decision would improve academic achievement

in public schools. For example, Legislator 22 supported the proposal to limit class size to

25 students in all public schools. When asked what evidence or arguments a colleague

who disagreed might offer, Legislator 22 said they might argue that there is no proof that

the number should be 25 instead of 28 or 30. This counterargument was coded “specific”

 because it is directed at the issue of whether a 25-student limit would actually increase

academic achievement.

“Relevant” counterarguments concern the fiscal or political feasibility or 

consequences of the participant’s or opponents’ policy decision about the proposed

legislation, but with no reference to whether or how the decision would improve

academic achievement in public schools. Legislator 22 also offered an example of a

relevant counterargument by citing budgetary constraints as evidence against his decision

to support the class size limit.

“Unsuccessful” attempts to generate counterarguments were those where

 participant tried to offer a counterargument but failed to offer a specific or relevant

counterargument. A “nonattempt” occurred when a participant was unwilling or unable to

offer a counterargument. For purposes of calculating “argument repertoire,” the total


number of counterarguments a participant generated was based on the number of specific

and relevant counterarguments.

Certainty

Certainty measured how sure participants were of their decisions. Participant

certainty was coded on a scale from 0 to 3 (0 = not certain, 1 = somewhat uncertain, 2 = somewhat certain, 3 = certain), based on participants’ responses to interview

Question 3 (“How sure are you that your decision regarding the legislation is correct?

 Not certain, Somewhat uncertain, Somewhat certain, or Certain?”). The answer to this

question was hypothesized to be the product of an affective signal in some instances. In

other words, participants would not assess how certain they were about a decision based

on the quantity and quality of information they could recall or had collected to support

their decision. Instead, they would assess how certain they were about their decision

 based on how certain they felt . This hypothesis would be supported if a participant was

certain about a decision without being able to cite decision-specific information to

support the decision.

Expert Knowledge

This study adopted Kuhn’s (1991) three-category scheme for evaluating

 participants’ epistemological understanding. Based on their responses to Questions 3

(concerning certainty) and 4 (“Do you think policy experts know for sure what the

correct decision about the legislation is? [If no] Would it be possible for experts to find out

for sure if they studied this problem long and carefully enough?”), participants were

classified in one of three categories: absolutist, multiplist, and evaluative. Absolutists

claim that “experts either do, or can with sufficient study, know with certainty the


causes” (Kuhn, 1991, pp. 173-174) of the complex real-world phenomena Kuhn

investigated. Multiplists deny the possibility of expert certainty and deny the existence of 

certain knowledge, embracing instead radical subjectivity. Finally, those with an

evaluative stance, which is the highest level in Kuhn’s scheme, also “deny the possibility

of certain knowledge . . . however, they regard themselves as having less certainty with

respect to the question than would an expert on the topic” (Kuhn, 1991, p. 187). For 

 present purposes, since participants were not asked for their certainty relative to experts,

 participants were ranked “evaluative” if they denied the possibility of certain knowledge

 but acknowledged in some way that experts could know more or be more certain than

those with less information, in other words, that knowledge on the topic mattered.

Self-Assessed Knowledge

This variable measured how participants rated their knowledge about the decision

topic, based on their response to Question 5 (“Have you ever considered or discussed this

proposal with anyone before today? [If yes] How knowledgeable would you say you are

about this proposal, on a scale from 0 to 4, with 0 representing no prior knowledge and 4

representing expertise?”). If participants had not considered or discussed the proposal

 previously, their self-assessed knowledge was coded as 0. If they had, their self-assessed

knowledge was the number they used to score their knowledge about the decision topic.

Affect

“Affect” recorded, as a “yes” or “no,” whether participants reported an affective

response to the decision question in their answer to interview Question 6 (“When I first

asked you this question about this legislation, did it bring to mind any positive or 

negative feelings, ideas or images? [If yes] What were those feelings, ideas or images?”)


and what the nature of the response was. Self-report measures like certainty and affect

were analyzed in connection with other self-report measures and with objective measures

of response time and the sources and quality of evidence to reach conclusions about

whether or not preconscious processes influenced participants’ decision making.

Reported Speed to Decision

In addition to measuring response times, participants were asked to rate how

quickly they made their policy decision, at the end of the interview following the

decision. Question 8 (“Looking back, how quickly did you make your decision?

Instantaneously, Quickly, Deliberately, or Slowly?”) was drafted to allow a comparison

 between measured latencies and participants’ reports. Reported speed was coded from 0

(slowly) to 3 (instantaneously). If participant’s response suggested the operation of an

overall evaluative tally, that response was not coded on this scale. Therefore, when a

 participant answered Question 8 by saying that the decision as part of the interview was

quick or instantaneous but that the decision was the result of deliberation over time prior 

to the interview, for example, it was recorded as evidence of the operation of an overall

evaluative tally and was not coded on the 0 to 3 scale.

Argument Repertoire

“Argument repertoire” is a measure of opinion quality that Cappella, Price, and

 Nir (2002) created, based on Kuhn (1991), for use in political survey research. Argument

repertoire was a total score for each individual consisting of the number of justifications

offered, plus the number of counterarguments offered.
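Expressed schematically (the counts below are hypothetical, not data from the study), the score is a simple sum:

```python
def argument_repertoire(num_justifications: int, num_counterarguments: int) -> int:
    """Opinion-quality score following Cappella, Price, and Nir (2002):
    justifications offered plus specific or relevant counterarguments offered."""
    return num_justifications + num_counterarguments

# Hypothetical participant: three justifications and two counterarguments.
print(argument_repertoire(3, 2))  # -> 5
```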


Choice of Decision Model

Participants were shown the traditional model of reasoning and decision making

(Figure 1) and the intuitive model of decision making and reasoning (Figure 2) and were

asked to decide which model more accurately described how most people made political

decisions and then how they personally made political decisions. In addition to their 

choice of model, what they said in connection with their choices revealed how they

thought about their decision making and the decision making of others.

Procedure

The author interviewed all participants individually and in-person using the

instructions and interview protocol in Appendix A. The interview was recorded using a

digital voice recorder. Legislative interviews were conducted in legislators’ state or 

district offices, their homes, or in some other mutually-convenient location. This

procedure permitted the interviewer to ask legislators the sort of educational policy questions they decide in the legislature, in the settings in which they actually make such

decisions. By asking the questions in person, it was possible to hold participants’

attention for the duration of the interview and to record the interview. Also, lobbyists and

other interested parties often solicit legislators’ support on specific legislation in face-to-

face meetings. The procedure of this study attempted to approximate those conditions,

with the obvious exclusion of any efforts by the interviewer to persuade the legislators of 

any particular position. Doctoral students were interviewed in offices in a college of 

education. Since students do not make political decisions on a regular basis in a specific

 place, the location of student interviews was not as important as it was for legislators.


The interviews proceeded as follows. After the opening instructions participants

were asked to make a decision on one of the two policy questions. After the participant

made a decision, they were asked the questions in Part 1 of the protocol in Appendix A

concerning their evidence, counterarguments, and certainty among other things. This

 procedure was repeated for the second decision question. Decision order was

counterbalanced so that some participants answered the class size question first, while

others answered it second. After the interview relating to the two policy decisions was

completed, participants were asked the questions in Part 2 of Appendix A. They were

first asked a general question about educational policy. Then they were asked to review a

diagram of the traditional model of reasoning and decision making (Figure 1) and of the

intuitive decision making and reasoning model (Figure 2) while the interviewer described

the differences between the two models. Participants were then asked to select which

model more accurately described how most people make political decisions and then how

they themselves made political decisions.

Measuring Response Times

Measuring how long it took participants to make a decision and to offer support

for that decision was a critical element of the data analysis in this study. While listening to and coding the legislative interviews, the author realized that it was also worth measuring how

long it took legislators to generate counterarguments and to decide whether the proposed

legislation was liberal or conservative, because it seemed to take legislators longer to

answer these questions than it took to make the initial policy decision. However,

measuring decision latency and analysis time turned out to be much more difficult than

expected.


The procedure for timing was as follows. With headphones on, the author listened

to the interview recording. When the interviewer finished asking the decision question,

the stopwatch was started and it was stopped when a participant said yes, no, support, or 

oppose for example. This was decision latency. For analysis time, the stopwatch started

when the interviewer finished asking Question 1 and stopped when the participant began

the first phrase or sentence in which participant offered a reason to explain his or her 

decision. Of course, knowing when that phrase or sentence began required repeated

listening. The watch was stopped at the beginning of the phrase or sentence in which the

reason was reported because measuring any more time would make it inappropriate to

compare decision latency and analysis time. Given that more words are required to

explain a decision than to state a decision to support or oppose, it takes longer to actually

voice a justification than to voice the words “yes,” “no,” “support” or “oppose.” Unless

analysis time was measured to the beginning of a statement of justification, analysis time

would be exaggerated and any differences between decision latency and analysis time

would be meaningless. The same was true for counterargument latency and partisan

latency, which were measured in the same way, starting the watch at the end of the

interview question and stopping it at the beginning of the word or phrase that answered

the question.
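Expressed schematically, each latency is the gap between the end of the interviewer’s question and the onset of the relevant response phrase, coded in whole seconds. The function and timestamps below are illustrative assumptions; timing was actually done by hand with a stopwatch.

```python
def latency_seconds(question_end: float, response_onset: float) -> int:
    """Gap between the end of the interview question and the onset of the
    response phrase, in whole seconds; gaps of zero or less are coded as 0."""
    return max(0, round(response_onset - question_end))

# Hypothetical timestamps (seconds from the start of a recording):
decision_latency = latency_seconds(question_end=12.0, response_onset=13.4)  # 1
analysis_time = latency_seconds(question_end=20.0, response_onset=23.6)     # 4
```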

To make sure time was measured correctly, this procedure was repeated at least

twice for each participant and for each variable. Time gaps were measured in whole

seconds. In those instances where a response came during or immediately after the

interviewer’s question, the gap (if there was one) was coded as zero seconds. A gap


deliberation on the question, the participants settled upon their decision. Since the

decision followed reasoning about the question, and it was not possible to ask Question 1

separately from the decision question, an identical time was recorded for both decision

latency and analysis time.

The fourth scenario involved those participants who asked the interviewer 

questions about the decision question after it was asked. Because these participants did not make a decision until they had asked the interviewer one or more questions about the proposal, it was not clear when to start and stop the stopwatch, and decision latency could not be measured. As a result of this fourth scenario, there were no data for

certain participants on decision latency or analysis time.

After listening to the interviews, another issue became clear regarding decision

latency. Given how quickly legislators made their decisions, it is likely they began to

make a decision about each proposal once the interviewer spoke the phrase “class size”

or “private company,” but because the question continued beyond these phrases, the

 beginning of decision latency was measured from a later point in time, when the question

was completed. As a result, the amount of time it took participants to make a decision

may actually have been longer than measured by decision latency.

Interrater Agreement

To evaluate the coding schemes for the variables measured in this study, another 

rater, an advanced doctoral student in Human Development, coded the data from selected

 participants. Training consisted of an explanation of the variables to be measured,

 presentation of relevant and prototypical examples, and the illustration of the coding

scheme for each variable. As part of the training, the second rater used the coding scheme


to code the data for one randomly selected participant. Codings were then compared to identify agreements and any differences.

After training was completed, the second reviewer coded the data for six

randomly selected legislators, so that interrater agreement could be calculated based on

10 percent of the participants. Given two decisions for each participant and the number of 

variables measured, there were 194 points of possible agreement. Interrater agreement

was calculated by dividing the total number of points on which we agreed (164) by this

total amount, which resulted in interrater agreement of 85 percent.
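A minimal sketch of this percent-agreement calculation follows; the example codings are hypothetical, and only the 164-of-194 figure comes from the study.

```python
def percent_agreement(rater1, rater2):
    """Proportion of coding points on which two raters assigned the same code."""
    assert len(rater1) == len(rater2)
    agreed = sum(1 for a, b in zip(rater1, rater2) if a == b)
    return agreed / len(rater1)

# Hypothetical codings for three points: agreement on two of three.
print(percent_agreement(["external", "specific", 3], ["external", "relevant", 3]))  # ~0.67

# The study's figure: 164 of 194 possible points agreed.
print(round(164 / 194, 2))  # 0.85
```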

Once the second rater had completed all codings and interrater agreement was

calculated, we sat down to go through each of the six transcripts. We discussed the bases

for our respective coding decisions until all disagreements were resolved.


CHAPTER IV

RESULTS AND DISCUSSION CONCERNING

PRECONSCIOUS INFLUENCES ON DECISION MAKING

This chapter presents the data most relevant to answering the first research

question: “Do the decisions of state legislators and doctoral students in a college of 

education about two educational policy issues, and their responses to interview questions

about their reasoning on those issues, provide evidence of the influence of preconscious

 processes on decision-making and reasoning about policy issues?”

Evidence to respond to the first research question came from four primary

sources. The first of these was participants’ response times: how much time did they take

to make a decision (decision latency) and to report the reasons for that decision (analysis

time)? How did decision latency compare with counterargument latency and partisan

latency? The second source of evidence of preconscious processes was participants’ self-

assessed knowledge and certainty about, and affective response to, the decision

questions. How certain were participants about their decisions and how did their certainty

relate to their self-assessed knowledge? Did the decision topic evoke an affective

response? The third source of evidence was the nature and quality of participants’

reasoning about each decision and about their own decision-making process. In other 

words, what type of evidence did participants offer in support of their decisions and what

was the source and quality of their rationale? The final source of evidence of 

 preconscious influences on decision making was participants’ choice of decision models

and their comments about these models and their own decision-making processes. A

second analysis of this final piece of evidence, what participants said about the decision


models and how they elaborated upon the models, will be treated separately in Chapter 

VI.

The variables discussed in this chapter, along with the relation among them,

include: response time (i.e., decision latency, analysis time, counterargument latency, and

 partisan latency), reported speed to decision, citing evidence, justificatory rationale,

 justifications, certainty, affect, self-assessed knowledge, and choice of decision model.

Throughout this document, comparisons between the results for legislators and graduate

students are made descriptively and not statistically.

Table 2 presents the results for all the quantitative variables measured in this

study. The table includes the data for both decisions and for both sample groups, which

makes it possible to compare how legislators' responses for the class size decision

compared to their responses for the privatization decision, how graduate students'

responses for the class size decision compared to their responses for the privatization

decision, and how legislators' responses for one or both decisions compared to graduate

students' responses. The table also shows where there were significant differences

 between the mean value of a variable for the class size decision and the mean value of 

that same variable for the privatization decision based on paired samples t-test analyses.

So, for example, as shown in Table 2 the difference between legislators' analysis time for 

the class size decision and the privatization decision was significant, t (37) = -2.40, p =

.02 (two-tailed).
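For readers who wish to see the form of this comparison, a minimal sketch of a paired-samples t test follows. The values are hypothetical and the scipy call is illustrative; it is not the software actually used in the study.

```python
from scipy import stats

# Hypothetical per-legislator analysis times (seconds) for the two decisions.
class_size = [1.0, 2.5, 0.5, 3.0, 1.5, 2.0]
privatization = [2.0, 4.0, 1.5, 6.0, 3.0, 5.5]

t_stat, p_value = stats.ttest_rel(class_size, privatization)  # two-tailed by default
print(f"t({len(class_size) - 1}) = {t_stat:.2f}, p = {p_value:.3f}")
```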

There were missing data points for certain individuals on certain variables. In

those cases where there were missing data points, a question may not have been asked or 

a participant’s answer may have been unresponsive or unclear. Means, standard


deviations and significance were calculated based on available data points. It is also

worth noting that for graduate students Table 2 presents unadjusted and adjusted decision

latencies and analysis times. The unadjusted values were based on the data from all the

graduate students for whom it was possible to measure decision latencies and analysis

times. Adjusted values were calculated after the data from several students with

especially lengthy response times were excluded from the calculation of means. Unless

otherwise noted, the analysis of the graduate student data is based on the unadjusted

values.

Response Times and Reported Speed to Decision

Legislators made complex and in some cases novel educational policy decisions

almost instantaneously. They offered explanations for their decisions almost as quickly.

Table 2 shows that for legislators mean decision latency for the class size decision was

1.36 (SD = 2.24) seconds and mean analysis time was 1.84 (SD = 2.75) seconds while

decision latency for the privatization decision was 1.87 (SD = 2.67) seconds and analysis

time was 3.55 seconds (SD = 4.58). For both decisions mean decision latency was shorter 

than mean analysis time. The difference between decision latency and analysis time for 

the privatization decision was significant for legislators, t (37) = -2.40, p = .02. These

results run counter to existing models of decision making, which posit that decisions are

produced by and come later in time than conscious reasoning about the decision

question. If this were the case, mean analysis time for both decisions should be shorter 

than mean decision latency.


Table 2

Legislator (Leg.) and Graduate Student (Grad.) Data for Quantitative Variables

Variable                               N (CS/P)   Min (CS/P)   Max (CS/P)   M (CS/P)          SD (CS/P)

Response times
Leg. Decision Latency                  38/41      0/0          9/8          1.36a / 1.87b     2.24 / 2.67
Grad. Decision Latency                 18/16      0/0          93/82        11.33 / 8.50      22.95 / 20.57
Grad. Decision Latency (Adjusted)      15/14      0/0          10/6         2.86 / 2.00       3.41 / 1.92
Leg. Analysis Time*                    39/40      0/0          9/25         1.84c / 3.55b     2.75 / 4.58
Grad. Analysis Time                    17/17      0/1          93/82        11.00 / 10.47     23.79 / 19.60
Grad. Analysis Time (Adjusted)*        14/15      0/1          9/17         1.85 / 4.66       2.56 / 4.51
Leg. Counterargument Latency           35/39      0/0          12/18        2.17 / 2.97       2.95 / 3.83
Grad. Counterargument Latency          18/18      0/0          17/19        2.61 / 3.38       4.67 / 5.07
Leg. Partisan Latency                  41/40      0/0          18/11        4.73a,c / 2.65    4.66 / 3.10
Grad. Partisan Latency                 17/18      0/0          11/8         3.64 / 2.72       3.21 / 2.73

Leg. Self-Assessed Knowledge***        36/38      0/0          4/4          1.90 / 0.73       1.55 / 1.31
Grad. Self-Assessed Knowledge**        18/17      0/0          4/2          1.86 / 0.41       1.47 / 0.79
Leg. Reported Speed to Decision        31/35      0/0          3/3          2.25 / 1.91       0.96 / 1.09
Grad. Reported Speed to Decision       17/18      0/0          3/3          1.88 / 1.66       1.16 / 1.08

Note. Response times measured in seconds; evidence, reasoning, and counterarguments measured by number of reasons, words, and counterarguments, respectively. Self-report variables measured on variable-specific scales: certainty measured on a scale from 0 (not certain) to 3 (certain); self-assessed knowledge on a scale from 0 (no knowledge) to 4 (expertise); and reported speed to decision on a scale from 0 (slowly) to 3 (instantaneously). CS = class size decision; P = privatization decision. For certain variables, the differences between the mean values for the two decisions on that variable were statistically significant: *p < .05. **p < .01. ***p < .001. For legislators: a = the difference between decision latency and partisan latency for CS was significant at the .001 level; b = the difference between decision latency and analysis time for P was significant at the .05 level; c = the difference between analysis time and partisan latency for CS was significant at the .001 level.


When asked to describe how quickly they had made their policy decision, 18

legislators (43%) said “instantaneously” for the class size decision and 14 (34%) said

instantaneously for the privatization question. In total, more than half of the legislators

said that they had made their policy decisions quickly or instantaneously (60% for class

size, 56% for privatization). This in itself is not evidence of preconscious influences, but

it sustains the hypothesis that decision making about complex questions is influenced by

 preconscious processes, since such processes operate more quickly than conscious

reasoning (Bargh et al., 1996; Epstein & Pacini, 1999; Zajonc, 1980).

Listening to the legislators’ responses during interviews revealed that legislators

did not immediately answer the question of whether the class size issue was better 

described as a liberal or a conservative position. It seemed as though legislators were

thinking more deliberately about the question of conservative and liberal than they were

about the policy decision itself, which was surprising given that the policy decisions were

more complex and should have taken longer to decide if the traditional model held. “Will

reducing class size to 25 students in all public schools improve academic achievement?”

appears to be a more complex question than “Is a proposal to reduce class size to 25

students in all public schools better characterized as a liberal or conservative position?”

 because deciding whether the proposal will improve academic achievement requires the

evaluation of many more variables, processes and consequences, and how these would

interact over time.

Considering partisan latency and counterargument latency, it took legislators less

time on average to decide whether to support or oppose the proposed legislation ( M =

1.36 seconds, SD = 2.24 for class size and M = 1.87 seconds, SD = 2.67 for privatization)


than it took them to decide whether the proposed legislation was liberal or conservative

( M = 4.73 seconds, SD = 4.66 for class size and M = 2.65 seconds, SD = 3.10 for 

 privatization). The difference between partisan latency and decision latency for the class

size issue was significant, t (37) = -4.35, p = .000. Counterargument latency was also

longer than decision latency for both topics ( M = 2.17 seconds, SD = 2.95 for class size

and M = 2.97 seconds, SD = 3.83 for privatization).

There is no obvious hypothesis to explain why mean partisan latency would be

longer than mean decision latency. During the interviews, it was apparent that the

only question from Part 1 of the interview protocol that legislators regularly answered by

deliberating first and then deciding was the question of whether the class size proposal

was liberal or conservative, which is why partisan latency was included as a variable in

this study. For the other interview questions, the legislators’ answers seemed to come

right after the questions were finished. Based on how legislators responded to the various

interview questions, there is reason to believe that mean decision latency would have to

 be at least 5 seconds for both decision topics if legislators were actually thinking

consciously about the decision questions before making a decision (the manner suggested

 by the traditional model). Therefore, mean decision latencies of 1.36 and 1.87 seconds

can be taken as evidence that legislators did not make their decisions in the manner 

suggested by the traditional, purely conscious model of decision making.

This appears to be too little time to bring to mind the consequences that would

follow from support of and from opposition to the proposal, to evaluate these

consequences, including how well they would advance the goal of improving academic

achievement, what their costs would be, and how likely the consequences are to occur, all


of which are required by expected utility models, and then to report the decision that

offered the best trade-off between costs and benefits. This doubt was reinforced by the

fact that the class size proposal was novel for almost half of the legislators (41%) and the

 privatization proposal was novel for 3 out of 4 legislators.

Graduate students’ mean decision latencies and analysis times for both decisions

were considerably longer than legislators’ times (see Table 2). Mean decision latencies

were 11.33 seconds (SD = 22.95) for the class size question and 8.50 seconds (SD =

20.57) for the privatization question. Mean analysis times were 11 seconds (SD = 23.79)

for the class size question and 10.47 seconds (SD = 19.60) for the privatization question.

These are the unadjusted values for graduate student decision latency and analysis time in

Table 2.

One of the reasons for the large difference between legislators’ and students’

response times is that four graduate students deliberated for an extended amount of time

 before making a decision or offering reasons to explain their decision. On the class size

decision there were three students for whom decision latency was coded as 31, 37 and 93

seconds respectively. There were two students for whom decision latency was coded as 26 and

82 seconds respectively on the privatization decision; one of these students was also in

the first group of three. As explained in Chapter III, because these students reasoned

about the legislative proposals and then made a decision, without the interviewer 

 prompting them to provide the reasons for their decision, decision latency and analysis

time are identical. These lengthy response times had a great impact on mean decision

latencies and analysis times for graduate students.


If these four students’ decision latencies and analysis times are removed from the

calculation of means, mean decision latency for graduate students drops to 2.86 seconds

(SD = 3.41) and 2 seconds (SD = 1.92) for the two decisions, while mean analysis time

drops to 1.85 (SD = 2.56) and 4.66 seconds (SD = 4.51). These results are shown in Table

2 as “adjusted” values for graduate student decision latency and analysis time. The

difference between adjusted analysis time for the two decisions was statistically

significant, t (11) = -2.78, p = .01 (two-tailed). These adjusted values are much closer to

legislators’ mean decision latency values of 1.36 seconds (SD = 2.24) and 1.87 seconds

(SD = 2.67) and mean analysis time values of 1.84 seconds (SD = 2.75) and 3.55 seconds

(SD = 4.58). Even when compared to the adjusted graduate student values, legislators

made their decisions and offered their rationales more quickly, but now the differences

are not measured in tens of seconds but in hundredths. Chapter V returns to the question

of how to treat the extreme values measured for several graduate students. For purposes

of this chapter, however, the analyses are based on the unadjusted graduate student data

 because there is reason to believe that these four students were representative of some

 portion of the graduate student population.
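The adjustment amounts to recomputing the means after dropping the participants whose response times were judged extreme. A minimal sketch follows, with hypothetical latencies rather than the study’s data.

```python
def mean_excluding(latencies, excluded_indices):
    """Mean latency after dropping the participants identified as extreme cases."""
    kept = [x for i, x in enumerate(latencies) if i not in excluded_indices]
    return sum(kept) / len(kept)

# Hypothetical graduate-student decision latencies (seconds); the last three values
# stand in for the extended deliberations described above.
latencies = [1, 0, 3, 2, 5, 31, 37, 93]
print(round(mean_excluding(latencies, excluded_indices={5, 6, 7}), 2))  # 2.2
```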

Decision latency and analysis time were longer for graduate students than for 

legislators, and their reported speed to decision was slower. Graduate students reported taking

longer to make each decision than legislators did. On a scale from 0 (slowly) to 3

(instantaneously), graduate students’ mean reported speed to decision for the class size

decision was 1.88 (SD = 1.16) and 1.66 (SD = 1.08) for the privatization decision,

compared with 2.25 (SD = .96) and 1.91 (SD = 1.09) for legislators (higher numbers

mean a faster decision). Legislators and graduate students reported taking longer to make


a decision on privatization, and while the mean values are greater for legislators, for both

groups the reported speed to decision falls on either side of “quickly” (coded as 2).

Levels of Certainty, Self-Assessed Knowledge, and Affective Response

On a certainty scale from 0 (not certain) to 3 (certain), legislators averaged

certainty of 2.47 (SD = .91) for the class size decision and 2.40 (SD = .95) for the

 privatization decision. For both decisions, more than 63 percent of legislators were

certain they were correct and more than 85 percent were either somewhat certain (coded

as a 2) or certain. For the two decisions, self-assessed knowledge on a scale from 0 to 4

was 1.90 (SD = 1.55) and 0.73 (SD = 1.31). This difference between self-assessed

knowledge for the two decisions was significant, t (34) = 4.73, p = .000 (two-tailed).

The mean number of justifications legislators offered in support of the class size

decision was 1.93 (SD = 1.03) and for the privatization decision was 2.20 (SD = 1.14).

For the class size issue, the correlations among certainty, self-assessed knowledge, and

number of justifications were not significant. For privatization, however, the correlation

 between self-assessed knowledge and number of justifications (r = 0.33, p = .04) and the

correlation between certainty and number of justifications (r = 0.32, p = .04) were

significant.
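These are ordinary Pearson correlations computed across participants. A minimal sketch of the calculation, with hypothetical scores rather than the study’s data:

```python
from scipy.stats import pearsonr

# Hypothetical paired scores: self-assessed knowledge and number of justifications.
knowledge = [0, 1, 2, 0, 3, 1, 2, 4]
justifications = [1, 2, 3, 1, 4, 2, 2, 5]

r, p = pearsonr(knowledge, justifications)  # two-tailed p-value
print(f"r = {r:.2f}, p = {p:.3f}")
```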

That certainty for both issues was almost identical while self-assessed knowledge

was significantly different for the two issues suggests that certainty and amount of 

information were not related, supporting the hypothesis that how certain we are about a

 position can be the product of a feeling of knowing rather than a conscious assessment of 

how much we know. There is further support for this hypothesis in the mean number of 

justifications, which was approximately two for each decision topic. It can


be argued that if certainty is based on a conscious evaluation of the amount of information one has about a topic, then two justifications do not constitute sufficient decision-specific information to support the high level of certainty legislators reported for both decisions. Given the data for the class size decision, these hypotheses and justifications hold.

The data for legislators on the less familiar privatization decision cannot be

interpreted in this way, however. For this second issue certainty and self-assessed

knowledge were not correlated (r = .22, p = .19 [two-tailed]), but self-assessed

knowledge and number of justifications (r = .33, p = .04 [two-tailed]), and certainty and

number of justifications (r = .31, p = .04 [two-tailed]), were. The significant correlation

 between certainty and number of justifications undermines the hypothesis that certainty is

an affective signal unrelated to how much information one has. One possible explanation

is that because the privatization issue was novel for 3 out of 4 legislators, and because

they were conscious of how little they knew about this topic, as reflected in low self-

assessed knowledge, their reported certainty was potentially the product of a conscious

evaluation of how much they know about the issue of transferring control of public

schools to a private company.

For the class size issue legislators reported fewer justifications than for the

 privatization issue but a higher level of self-assessed knowledge and certainty, suggesting

that for the class size issue legislators may not have been consciously aware of how little

they knew. This could explain why certainty was not significantly related to the number 

of justifications (r = .05, p = .75 [two-tailed]), just as self-assessed knowledge was not

significantly related to the number of justifications (r = .18, p = .27 [two-tailed]). In sum,


these data suggest that in some cases certainty may have been based on a conscious

evaluation by participants of their state of knowledge on an issue, and in others it may

have been based on an affective sense of how much participants felt they knew.

Finally, on the subject of affective response, for both topics 73 percent of 

legislators reported that the decision question brought to mind positive or negative

feelings, ideas, or images. That the same number of legislators reported an affective

response to both decision questions was unexpected given that the privatization topic was

less familiar and appeared to be more emotionally salient than the class size topic. At the

same time, if the studies cited in Chapter II are correct, then participants should have had

an affective response to every policy question they encountered, whether or not the topic

was provocative, which is consistent with the data collected to measure affect in the

 present study.

In response to interview Question 6 (“When I first asked you this question about

this legislation, did it bring to mind any positive or negative feelings, ideas or images?”)

many of the legislators described positive or negative feelings or images the decision

question brought to mind. As explained in the previous paragraph, the same number of 

legislators answered this question in the affirmative for both decisions. However, in

responding to Question 6, more legislators specifically described these feelings or images

in response to the proposal to privatize.

When faced with this proposal, one legislator had an image of Robocop, the

movie in which municipal police officers were replaced by private contractors and

cyborgs (L2). Similarly, another legislator observed, “If talking about images, when we

started talking about private companies running the schools [ pause] I had this picture of a


really clean and well-operated schools like a corporation you know all the grass is cut

 perfectly, the buses are on time you know and yes, exactly it looks like [name of local

conference center and resort] or [ pause] one of these nice corporate you know [name of 

local planned community] or something like that. But then when I thought about the

initial image that came to me about corporate teaching was kind of the feeling of like you

know like former Soviet Union automatons just sitting there and them you know telling

them stuff, everything they wanted to tell them and not telling them the whole story”

(L27).

Also on the privatization question, one legislator described how vigorous the

opposition of teachers’ unions would be, and how serious the consequences would be for 

any Democratic legislator who supported privatization, “So I mean from a sheer political

self-interest standpoint it’s like Oh My God!” (L3). Another legislator was surprised by

the proposal to privatize schools in her school district, and she reported “a few negatives

and I think the reason is, is because it was something that I had never even in my wildest

dreams contemplated before so it was like Oh!” (L9).

Several legislators reported a basic opposition to privatization: “That whole idea

of privatization of that function brings about in my mind a negative feeling” (L4); “I

think it’s just an overall negative response” (L24); “I have a visceral negative response”

(L26); “its just an idea that is abhorrent to me” (L33); “some negative images of the

continued onslaught against public education” (L29); “immediately [the] code word of 

 privatization shot my tentacles up to say ‘Oh God, I’m probably not going to like this’”

(L34); and “a bad gut feeling” (L40).


On the class size issue, such comments were less frequent and more benign: “my

first image is the trailers” (L38), or “a mental image of a temporary classroom otherwise

known as a trailer” (L41). In response to Question 6 one legislator explained, “I mean the

only thing that happened to me is like what happens with so many other policy cases of 

Oh God it sounds good but what else should I be thinking about or Oh, it sounds good

 but there are drawbacks in the moment, and you know you feel, just how torn I often feel

that we often have to say no to good policies because of the fiscal situation” (L34). This

is the sort of conflict Epstein (1990) described in connection with his cognitive-

experiential self-theory.

Like legislators, graduate students were almost certain of their decision on the

 proposal to limit class size, but they generated even fewer justifications to support their 

decision than legislators did. For the class size decision, the relation among graduate

students’ certainty, self-assessed knowledge, and the number of justifications was similar 

to the corresponding data for legislators, in terms of there being a high level of certainty

in the absence of abundant decision-specific information to justify it. Where graduate

students and legislators diverged was in their certainty about the privatization issue. So,

while the data collected from graduate students regarding certainty, self-assessed

knowledge and number of justifications supported the conclusion that preconscious

 processes were at work in the class size decision, graduate students responded differently

to the privatization proposal and the follow-up questions. Preconscious processes may

still have shaped how the students made their decisions about whether or not to

 privatize, but there was stronger evidence of conscious monitoring by graduate students

in connection with the privatization decision.


The difference between graduate students’ mean certainty in their class size

decision ( M = 2.46, SD = .63) and in their privatization decision ( M = 1.72, SD = 1.12)

was significant, t (14) = 2.30, p = .04 (two-tailed). The difference between the average

number of justifications graduate students offered for the two decisions (M = 1.22, SD = .42

for class size and M = 1.66, SD = .68 for privatization) was also significant, t (17) = -2.20,

 p = .04 (two-tailed). Contrary to the data on number of justifications, graduate students’

mean self-assessed knowledge was lower for the privatization decision ( M = 1.86, SD =

1.47 for class size and M = .41, SD = .79 for privatization) and the difference between

these means was significant, t (16) = 3.33, p = .004 (two-tailed). By comparison, the

differences in certainty and number of justifications for the two decisions were not

significant for legislators. So while legislators’ response times for both decisions, and the

absence of a significant correlation between certainty and self-assessed knowledge, point

to the operation of preconscious processes with little evidence of conscious monitoring,

graduate students responded quite differently to the two decisions in terms of how much

they thought they knew and how certain they were.

However, what legislators and graduate students had in common was that on

average both groups reported more justifications for the privatization decision than for 

the class size decision, even though both groups reported lower levels of certainty and

self-assessed knowledge for the privatization decision. This may be evidence that

certainty is the product of a feeling of knowing rather than a conscious assessment of 

what one knows. At the same time, it may be evidence that both legislators and graduate

students deliberated about the privatization decision, which would have made them more

aware of how limited their information about the topic was but would also enable them to


legislators justified their decisions with personal or professional experience or by

reference to information without identifying its source (i.e., data).

In most cases, legislators’ decisions appeared to issue from their existing beliefs

or principles rather from conscious deliberation about the legislative proposals presented

in this study, as evidenced by their quick response times and the limited number of 

justifications legislators offered in support of their decisions.

follow, it appeared that many legislators’ beliefs, principles, catch-phrases and world

views served as heuristics that substituted for conscious reasoning about the proposed

legislation and its consequences, or the various alternatives to the proposed legislation

and their respective expected utilities.

The best way to present evidence of legislators’ reliance on something other than

decision-specific evidence in making decisions is to present excerpts from interviews.

For instance, to explain his opposition to the proposal to limit class size, one legislator 

observed that we “already spend too much money on public ed” (L38). The same

legislator explained why this proposal brought to mind negative feelings, “I have the

same reaction to a slightly lesser extent when I hear a proposal to mandate anything to

anyone for any reason” and “I’m very reluctant to impose a one-size-fits-all mandate like

you describe” (L38).

In opposing privatization, another legislator explained, “I trust their [the school

 board’s] judgment. I think they are up front with me” (L39). Another legislator noted, “I

have confidence that my school board has the knowledge of the details and the needs,”

 but the legislator “can’t cite any specific” information (L15).


Similarly, a different legislator did not support a state-wide mandate on class

sizes because “the closer you get to the situation the more accurate shall we say the

decision” (L9). On a related note, “the negative aspect of forcing people, putting mandate

on people it was sort of an instant a negative feeling about doing something like this”

(L11). Yet another legislator opposed the class size issue with “I don’t support unfunded

mandates” (L16).

Highlighting how differently participants answered these questions, a legislator 

who supported the class size limit did not think data were necessary to explain or support

the decision: “class size is something we can kind of again turn to common sense, yes we

like to see reports you know however fancy they may be but you know it’s a common

sense decision” (L22).

Some legislators spoke in terms of larger philosophical principles. The same

legislator who opposed the class size proposal because it was an unfunded mandate

opposed the privatization proposal, “because I believe that it’s government[’s]

responsibility to provide education and public safety for our citizens” (L16). Similarly, “I

look at it more from a philosophical standpoint. You know I’m not sure what data is [sic]

out there” (L18). A third legislator, in opposing privatization, offered “a little philosophy

I guess” (L21). Passages from more legislators in opposition: “I think it’s just my

 philosophy about it” (L24); “Because I am philosophically opposed to privatization of 

 public services” (L26).

As one of only two legislators willing to support the proposal to privatize, a

legislator explained, “I believe we need to we need to do the best for our children that we

 possibly can” (L23). She continued, “I think that we as, as a government have a


responsibility to all the students, regardless of wealth” (L23). A legislator who supported privatization in principle nonetheless opposed the proposal as presented because “this

 particular proposal isn’t how I normally think of things. You are doing it at the top my

 preference [being] bringing in the private I would rather institute school choice” (L38).

For some, no variation of privatization was an option: “I believe in the public school

system” (L34). Also, “I don’t want any for-profit group running my school system,” and

“basically I am a non-believer” (L40).

To explain how she made her decision on class size in terms of her pre-existing

dispositions or tendencies, one legislator acknowledged, “I had some filters already that

it was filtering through rather rapidly” (L9). Similarly, another observed, “You come in

with some natural tendencies in favor or not in favor of certain facts” (L10). One former 

teacher revealed how such tendencies or allegiances influenced his policy decisions,

“I’ve also never been real big on privatizing. Being a labor supporter, a labor person

[ pause] I like to keep the jobs within the public sector I guess I would say” (L37).

To complete the discussion of evidence and rationale for legislators, and to

highlight why there is evidence to support the conclusion that in many instances

legislators were not consciously generating and weighing decision-specific evidence in

making their decisions, we turn to two passages that show how decisions seemed to

precede reasoning about the decision. Both of these passages are from legislators who

could be considered good, self-aware reasoners based on how much they knew about the

decision topic and how they reasoned about the issues in their think-aloud explanations

of their decisions. First, in response to the question about whether she would support or 

oppose transferring control over public schools to a private company, one legislator who


was a trained statistician responded as follows: “I would oppose it [long pause] because

why? I have to think about that one really” (L9). A second legislator who had a master’s

in educational studies answered the class size decision question as follows: “my initial

thought is to support it but I’m already seeing some of the problems with that” (L33).

When asked to offer specific evidence in support of her decision, she did not provide any

specific evidence and concluded, “I mean just intuitively it makes sense” (L33).

Turning to the evidence and rationale graduate students offered in support of their 

decisions, contrary to my hypothesis, proportionately fewer students offered external

evidence in response to Question 1 of the protocol than did legislators (Table A3). This

was unexpected given that the graduate students were all working on or had recently

completed doctorates in education. Also surprising was the fact that a greater proportion

of graduate students relied only on personal experience (i.e., personal beliefs, principles,

or experience) in making both decisions (Table A4). On average, for both decision topics

graduate students generated fewer justifications in support of their decisions, which

resulted in lower mean values of argument repertoire, even though they slightly surpassed the average number of counterarguments generated by legislators. Although the

 proportions were roughly similar, a greater percentage of graduate students reported that

they had not considered or discussed either decision topic previously.

These results were not hypothesized, but these data are evidence that graduate

students, like legislators, were making complex policy decisions without abundant and

arguably without sufficient decision-specific prior information. So, even though some

graduate students considered the decision questions for an extended period of time before

making a decision, that alone did not mean that graduate students’ decisions were the


 product of conscious reasoning alone. After all, notwithstanding longer decision latencies

and analysis times, graduate students still produced fewer and less sound justifications

for their decisions than legislators. Moreover, graduate students in many cases made their decisions on the basis of a preference toward supporting or opposing legislation that may not

have been the product of conscious reasoning about prior decision-specific information.

Choice of Decision Model to Describe Decision-Making Processes

Because participants spoke at length about the two decision models and discussed

their decision-making processes while thinking about the model diagrams, these

comments are presented in a separate chapter. Twenty-four legislators (58%) thought that

the intuitive model better described how most people make political decisions, and

another six legislators (14%) believed that the intuitive model better described how some

decisions were made, while the traditional model better described how other decisions

were made (Table A7). Of the remaining 11 legislators, one picked the traditional model,

eight legislators’ responses indicated confusion about the models, and there were no data

for two legislators.

On the question of whether the intuitive or traditional model better described their 

own decision making, more legislators reported that the traditional model alone (five

legislators or 12%) or both the traditional and intuitive models (11 legislators or 26%)

described how they made decisions (Table A8), possibly because of social desirability

 pressures. Sixteen legislators (39%) reported that the intuitive model better described

their decision-making process, while nine (21%) legislators provided unclear responses.

As discussed in Chapter VI, legislators openly acknowledged the influence of 

 preconscious factors in their policy decisions. Their reflections upon their own decision


making may be the best evidence collected in this study to justify an affirmative answer 

to the first research question.

However, approximately one in five legislators did not appreciate the differences

 between the two models and, as a result, offered responses that were not clear. By

comparison, no graduate student was confused about the two models or unclear in their 

response to questions about the models. One possible explanation for this finding is that

graduate students studied the diagrams and listened to the questions more patiently and

diligently than certain of the legislators. Another possible explanation is that some

legislators offered unclear responses when asked to select decision models because they

were not used to assessing theoretical models, so they were not able to quickly evaluate

the models presented. Graduate students by comparison may be more likely to encounter 

and evaluate such models. As a result of their prior experience with theoretical models,

graduate students may have been in a better position to quickly evaluate the models in

Figure 1 and Figure 2 and to select between them.

There were other questions for which legislators’ answers were not responsive or 

not clear, however. For example, seven legislators were coded as not responsive to the

question about expert knowledge on the privatization issue. On the same issue, nine

legislators were unsuccessful in generating a counterargument. Three legislators were not

responsive in answering Question 6 for the class size decision. One hypothesis about why

legislators did not always answer the question asked is that there may have been some

internal compulsion or perceived external pressure to make decisions and offer responses

quickly, whether or not the decisions were sound or the responses were clear. However, it


could be that the language of the questions was less clear to those not immersed in these

issues and exercises in the way graduate students are.

We now consider graduate students’ selection of decision models. Like

legislators, a large majority of graduate students believed that the intuitive decision

making and reasoning model described how most people made political decisions and

how they themselves made political decisions better than the traditional, purely conscious

decision model. Also like some legislators, some graduate students seemed to have a

lower opinion of how most people made decisions than they did of how they themselves

made decisions. Sixteen graduate students (88%) selected the IDMR to describe how

most people made decisions and two (11%) said most people use a combination of both

models. When asked about themselves, one graduate student (5%) selected the

traditional model, 11 (61%) selected the IDMR, and six (33%) selected a combination of 

 both.

Discussion

The evidence presented in this chapter questions the accuracy of the traditional

model of reasoning and decision making and lends support to the hypothesis that

 preconscious processes influence decision making and reasoning about policy questions.

For example, legislators’ decisions to oppose the proposal to transfer management and

control of their public schools from the local school board to a private company appeared

to be based on a gut-level response. Given that only two legislators supported the

 proposal, opposition was widespread. At some basic level, legislators seemed to be either 

open to the involvement of for-profit companies in public education or somehow


uncomfortable with it. Those in the second group did not seem to consider the proposal in

a way that left much room for persuasion.

For instance, several legislators who opposed the proposal justified their decision

 by explaining that private schools do not have to admit all students, while public schools

do. In other words, the legislators were saying that if the proposal to transfer control of 

public schools to a private company became law, students with special needs would not be admitted. This is a surprising thing for lawmakers to say, given that they were aware that the proposed legislation could be drafted to negate this concern. It seemed that few

legislators thought past their initial negative response. In other words, few legislators

seemed willing to modify the legislation proposed in a way that made it more acceptable.

If they had, it would suggest their opposition was based on the specific proposal they

were asked to consider, rather than on the larger issue of private enterprise and public

education. For instance, of those legislators who were concerned that private schools

might refuse to admit certain students, not one volunteered that he or she would be open

to private enterprise in public education if the privately-run schools were required to

admit all students.

By comparison, a legislator who opposed the legislation presented, but who was

open to private involvement in public education, suggested a different approach without

 prompting:

I just don’t trust a private company to answer to the taxpayer when it comes to

things like curriculum and policy. Now, when you, when I originally thought you

said that I was thinking well gosh, I think that would be a great idea to contract all

the things out to the private sector that private sector people are great at doing it


which is running a bus service, making sure the heating and air conditioning are

working, making [sure] the grounds are kept well, providing payroll services

[ pause] taking you know taxes out [ pause] doing human resource management.

All those kinds of things that companies do all the time and are really good at

would be a great idea to outsource because then the school system can

concentrate on one thing and that’s teaching kids. (L27)

This legislator’s response raises an important question for the study of political decision

making: why did one legislator propose a hybrid approach that retained public control of 

curriculum while privatizing more routine services like facilities maintenance and human

resource management, while the vast majority of legislators opposed the proposal without

any discussion of acceptable alternatives, or even a willingness to consider alternatives?

This question becomes even more pressing when you consider: how certain legislators were on this issue (M = 2.4 on a scale from 0 to 3); how low their self-assessed knowledge was (M = .73 on a scale from 0 to 4); that almost half (43%) of the legislators reported basing their decision on nothing more than personal beliefs, principles, or experience; and that 3 out of 4 legislators (77%) acknowledged that they had never even considered or discussed the proposal before the interview began.

If legislators’ opposition was not based on what they knew or reported and they had not

considered the proposal before, where did their opposition, of which they were certain,

come from? It can be argued that this opposition comes in the form of an affective (i.e.,

 preconscious) response to the idea, which is then rationalized or justified through

subsequent conscious processes.


If, on the other hand, the traditional model of reasoning and decision making is an

accurate and complete description of how people make complex political decisions,

including the decision to support or oppose the transfer of control of public schools to a

 private company, one could expect low self-assessed knowledge to result in low

certainty. If certainty was high, as in the present case, it is reasonable to expect more

decision-specific external evidence or justifications, as well as prior experience with the

decision topic. There is reason to argue that if the traditional model accurately describes

how legislators made their decisions, decision latency and analysis time should be at least

5 seconds (based on how long it took on average for legislators to make a decision about

Question 7, the one question they obviously paused to think about before deciding).

Further, if the traditional model held in all cases, legislators should have selected it as the

 better description of how they made policy decisions.

In terms of evidence of the operation of preconscious influences on decision

making about complex questions, graduate students’ data led to the same conclusions as

the legislators’ data. A subsequent chapter offers a dedicated discussion of how

legislators’ and graduate students’ results compared (Chapter V) and of what graduate

students revealed about their own decision-making processes in connection with their 

discussion of the decision models (Chapter VI).

For the purposes of the first research question, there was little evidence that

graduate students’ decision-making processes were different from legislators’ processes,

since the decisions and responses of both groups supported the hypothesis that

 preconscious processes influenced their decisions in this study. This conclusion is based

on the following results:


3 out of 4 graduate students made their decisions on both topics in 2 to 3 seconds;

for both decisions students on average generated fewer than two justifications to

explain or support their decisions; roughly 4 out of 10 students offered no basis

for their decisions other than personal beliefs, principles, or experience;

notwithstanding the limited information they had for the class size proposal,

graduate students reported a high level of certainty that their decision was correct;

a large majority of students reported that the decision questions brought to mind

feelings, ideas, or images; and all but one graduate student selected the intuitive

model or a combination of both models to describe their decision-making process,

reporting that intuitive processes influenced their political decisions.


Chapter V

COMPARISON OF LEGISLATORS’ AND GRADUATE STUDENTS’ DECISIONS

AND RESPONSES FOR TWO DECISION TOPICS

This chapter addresses the second and third research questions, which are closely

related. The second research question asks, “Do the decision-making and reasoning

 processes of state legislators and doctoral students differ for more familiar and less

familiar policy issues?” The third research question asks, “Do state legislators and

doctoral students in a college of education decide and reason differently about

educational policy issues?” This chapter expands upon what has been discussed in

Chapter IV about how the results for the two decisions compared and how the results for 

legislators and graduate students compared.

Comparative Analyses

The results in this section are presented separately for the second and third

research questions, even though there is considerable overlap between the two research

questions in terms of the data that are relevant to each. A portion of these data are

presented in Table 2 and Table A9 (Appendix D). Table 2 sets forth data concerning the

central tendencies and standard deviations for each decision and sample group on a

number of variables (Table A9 presents data on many of the same variables for each

 participant interviewed in this study), although the most significant finding of this study

may be that participants made decisions idiosyncratically.

In connection with the second and third research questions, this chapter presents

results for the following variables: decisions, decision latencies, analysis times, number 

of justifications, the evidence cited in response to Question 1 of the interview protocol


and the word count of the response to Question 1, justificatory rationale, argument

repertoire, rebuttals, certainty, self-assessed knowledge, and reported speed to decision.

These results are presented in an abbreviated form in those cases where they were

discussed previously in Chapter IV.

Comparison of Participants’ Decision-Making Processes for Two Decisions

Participants’ Decisions about Class Size Limits and Privatization

In the overwhelming majority of cases, participants supported the proposal to

limit class size in all public schools to 25 students (71% of participants) and opposed the

 proposal to transfer management and control of public schools from the local school

 board to a private company (89% of participants; Table A1). For chi-square analyses of 

these data for legislators and graduate students, see Table A10. The class size proposal was more familiar and the privatization proposal less familiar, a conclusion based on the fact that more participants described the second issue as novel (seventeen legislators, or 41%, said that the class size proposal was novel, while 31, or 75%, said the privatization proposal was novel; nine of the graduate students, or 50%, said the class size proposal was novel, and 14, or 77%, said the privatization proposal was novel) and on the fact that self-assessed knowledge for class size was significantly higher than for privatization (Table 2). Even so, it is possible that more participants supported the class size proposal for reasons other than its greater familiarity. For instance, participants

may have opposed the privatization proposal because for most participants it may have

evoked a negative feeling or other preconscious response before they deliberated upon

the question consciously. Evidence consistent with this possibility was presented in

Chapter IV.
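For readers who wish to see the mechanics of the chi-square comparison referenced above (Table A10), the following short Python sketch illustrates a test of independence between group membership and decision. The counts and variable names are hypothetical and are included only for illustration; they are not the frequencies analyzed in this study.

from scipy.stats import chi2_contingency

# Hypothetical support/oppose counts (not the study's data).
# Rows: legislators, graduate students; columns: support, oppose.
observed = [[30, 10],
            [12, 6]]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")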


The primary purpose of this section is to describe how results for the two

decisions compared. That most participants supported class size limits and most opposed

 private enterprise in public education may help explain how and why participants’

decisions on the two topics differed. If the central hypothesis is correct and preconscious

 processes do influence decision making, that would mean that most participants began

reasoning about the class size decision while supporting the proposal and most

 participants began reasoning about privatization while opposing it.

Ideological Explanations Offered in Support of Decisions

As discussed in Chapter IV, legislators and graduate students often explained

their decisions in terms of ideology, beliefs, or principles. Comparing how participants

explained each decision, the decision on privatization was more often explained in terms

of personal philosophy or principles. Participants did not speak as often in these

ideological terms about the decision to limit class size to 25 students. Instead, the

decision to support class size limits was based on personal experience in education,

common sense, or empirical data, for example. This was true for both legislators and

graduate students.

So, for instance, in opposing privatization one student explained that “as a

fundamental principle, I don’t see how making a profit could help [ pause] education, and

I, I think that [ pause] that’s a way of increasing the disparity that already exists” (GS2).

The same graduate student explained that she supported class size limits because she

learned as a teacher that individual time with students was essential to helping them

learn. Another graduate student who opposed privatization said, “They [schools] might

 become more efficient, just, but I’m not sure that I like the idea of the, the dollar being,


which I’m sure it would be with privatization, with the dollar being kind of the goal”

(GS4). Again, on the class size issue this graduate student based her support on personal

experience and comments from teachers, not on a philosophical position about class

sizes.

These passages suggest that the nature of support for the two decisions varied.

While the class size decision was based on educational experience, legislative

experience, or external data, the privatization decision often appeared to proceed from a

 principled opposition to private enterprise in public education. In opposing privatization,

one student said, “I think education by its very nature requires a non-profit orientation”

(GS9). In supporting the class size limit she explained, “I have yet to see a negative study

on reduced class size” (GS9). Another student explained her support for a class size limit

as follows, “I think we all know that better learning takes place in a, in smaller groups”

(GS12). By comparison,

a private company doesn’t necessarily need to enlist the feedback from their 

constituents and, you know, you have to look at, just like with private schools,

you have to look at, you know, what their motives are and [ pause] you know,

who’s feeding them money, and different things like that, and will they really,

will you really have as much say in the education of the students as you would

like, so I’d rather have it still public. (GS12)

As one last example of the difference in how each decision was justified by many

 participants, the sole male graduate student opposed privatization because, “I have a real

concern about the efficiency model of most business where, you know, progress is

measured either on return on investment or in terms of greater efficiency over time,


leading to lower costs, whereas, you know, efficiency in that kind of definition for an

academic model simply doesn’t make any sense” (GS14). He also opposed the class size

limit, because limiting class size without also providing more resources for schools and

teachers did not make sense.

The data from legislators on citing evidence (Table A2) and justificatory rationale

(Table A4) supported the observation that privatization was often opposed on the basis of 

an ideological opposition to the idea. For instance, for the class size decision 16 (39%)

legislators offered some form of external evidence in response to Question 1, while only

10 (24%) did so for the privatization question. Similarly, when reviewing the results for 

 justificatory rationale, more legislators (18 or 43%) relied exclusively on personal

evidence to support their decision on privatization than for the class size decision (11 or 

26%). These differences did not hold for graduate students (Tables A3 and A4). The

same number of students (three or 11%) offered external evidence in response to

Question 1 for both students. By comparison, seven students (38%) offered only personal

evidence to support their class size decision while eight (44%) did so for the privatization

question. As explained in connection with the third research question, it was not

hypothesized that a smaller proportion of graduate students would offer external support

for their decisions as compared to legislators because it was expected doctoral students in

education would have more decision-specific information for both policy questions.

On average, participants offered more justifications to support their decisions on

 privatization and more counterarguments to oppose their class size decisions (Table 2).

Mean argument repertoire, which is the total number of justifications and

counterarguments offered, was greater for the privatization decision for legislators and


greater for the class size decision for graduate students. In terms of the differences in

number of reasons and counterarguments, and argument repertoire for the two decisions,

only the difference in the number of reasons graduate students offered for the two

decisions was significant, t (17) = -2.20, p = .04 (two-tailed).

Participants’ Appraisals of the Partisan Characteristics of Legislative Proposals

As with the policy decisions themselves, there was considerable agreement

among participants concerning the partisan characteristics of the legislation proposed in

each of the two decision questions (Table A6). Most participants viewed the proposal to

limit class size as a liberal position, while the privatization proposal was viewed by most

as a conservative position.

Comparing Response Times for the Two Decisions

Moving from decision frequencies, evidence and partisan topic to response times,

Table 2 shows that legislators decided and reasoned about the class size issue more

quickly than for the privatization issue, while graduate students decided and reasoned

about class size more slowly. However, only the difference between legislators’ mean

analysis time for the two decisions was significant, t (37) = -2.40, p = .02 (two-tailed).

The patterns in Table 2 for response times for the two decisions and the two groups become less clear when you consider the diversity in participants’ response times (Table A9). In light of this diversity, the patterns apparent in Table 2 for mean decision latencies and analysis times become more difficult to support. For instance, decision latency for Legislator 1 for the class size question was

longer than decision latency for the privatization question, and for both decision

questions decision latency was longer than analysis time. Yet, the mean decision latency


and analysis time values for legislators in Table 2 reveal that the class size decision was

made more quickly than the privatization decision and that decision latencies were

shorter than analysis times for both decisions. The important point is that mean values for 

each group or each decision may reveal little about each individual’s results and about

how knowledge and experience bear upon each participant’s decision making about the

two decision questions.

It is worth considering whether central tendency data are of more than passing

utility in evaluating how individuals make complex decisions. In this study, response

time is one of several quantitative and descriptive variables used to answer the research

questions. In this chapter, for example, this examination of how the two decisions

compare and how the two groups compare is based on the central tendency data in Table

2, as well as a descriptive analysis of participants’ evidence, rationale, affective response,

choice of decision models and comments about these models. The limitations on central

tendency data are noted, however, to emphasize the differences among individuals and to

recommend that these data be considered in light of the data for each individual (see

Table A9).

Self-Assessed Knowledge and Certainty for the Two Decision Questions

Having discussed the limitations of central tendency data, the discussion turns to

an interesting and important difference in mean values between the two decisions: the

difference in self-assessed knowledge, which was significant for both legislators, t (34) =

4.73, p < .001 (two-tailed), and graduate students, t (16) = 3.33, p = .004 (two-tailed).

Both groups reported knowing significantly less about the privatization decision. This

lower knowledge did not significantly reduce legislators’ certainty that their decision on


this decision was correct, but graduate students did show significantly lower certainty in

their decision about privatization, t (14) = 2.30, p = .04 (two-tailed).
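As an illustration of the paired comparisons reported in this section, the following minimal Python sketch shows how a repeated-measures (paired) t-test on self-assessed knowledge for the two decision topics could be computed. The ratings and variable names are hypothetical and do not reproduce the study’s data.

from scipy.stats import ttest_rel

# Hypothetical self-assessed knowledge ratings on a 0-4 scale (not the study's data);
# each position is one participant, rated once for each decision topic.
class_size_knowledge    = [2, 3, 1, 4, 2, 3, 0, 2, 1, 3]
privatization_knowledge = [1, 1, 0, 2, 0, 1, 0, 1, 0, 2]

t_stat, p_value = ttest_rel(class_size_knowledge, privatization_knowledge)
print(f"t = {t_stat:.2f}, p = {p_value:.3f} (two-tailed)")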

Participants’ Comments about Decision-Specific Decision-Making Processes

Although decision model data were not collected separately for each decision

question, participants’ choices of decision models are discussed here in connection with

the second research question (about how the two decisions differ) because many

 participants offered their own theories about how decision making varies by the type of 

decision to be made. That a participant offered a personal theory is the only evidence

collected in this study concerning whether his or her theory accurately described how that

individual made one or both decisions in the present study. At a more general level,

however, these theories were consistent with the data presented in Chapter IV that

 preconscious processes influenced policy decisions and the data in this chapter that the

decision-making and reasoning processes of state legislators and doctoral students

differed for the two policy issues.

In brief, various participants offered one of three hypotheses about how the

 process of decision making might vary depending on the type of decision to be made.

The first hypothesis was that the intuitive decision making and reasoning (IDMR) model

applies to decisions on so-called “hot button” issues, like gun control and abortion in

which people are emotionally invested, while the traditional model applies to decisions

about less-sensitive questions like banking regulation. The second hypothesis was that

the IDMR model applies to questions for which decision makers have little prior 

knowledge, while the traditional model applies when there is an existing decision-

specific information base. Finally, the third hypothesis was that if one had information


about a topic the intuitive model would apply, but if the decision topic was novel then the

decision maker would decide in accordance with the traditional model. These models are

discussed in greater detail in Chapter VI in the section on “Decision Models May Be

Decision-Specific.” Although it is not possible to determine in this study whether one or 

more of these hypotheses are correct, that participants volunteered these alternative

models suggests that they may represent what the participants who volunteered them are

actually doing. In other words, that participants offered these hypotheses is evidence that

decision-making and reasoning processes differ for more familiar and for less familiar 

 policy questions.

Differences in How Legislators and Graduate Students Made Policy Decisions

Having proceeded through a discussion of how the results varied by decision

topic, this section addresses the third research question on how results compared for 

legislators and graduate students. The comparisons between legislators and graduate

students were made descriptively and not statistically. There were more similarities than

differences in legislators’ and graduate students’ responses to the policy and follow-up

interview questions. To begin with, most legislators and graduate students supported

class size limits and opposed privatization. This chapter focuses on two similarities and

one difference, and one set of data that serves as evidence of both the similarities and the

differences between the two sample groups. Response times were at once a similarity and

a difference, as explained in the next paragraph. The other similarities were in evidence

and rationale and in choice of decision models. The difference was in certainty. Each of 

these is addressed in turn.


Comparing Legislators’ and Graduate Students’ Response Times

The data on response times from legislators and students provide evidence of 

group differences and similarities. Looking at the unadjusted mean values for decision

latency and analysis time for the two groups in Table 2, there appeared to be a major 

difference in how much time members of the respective groups took to make a decision

(decision latency) and to offer support for that decision (analysis time). Legislators rarely

stopped to think about their decision before deciding to support or oppose the legislation

 proposed in each decision question, as evidenced by mean decision latencies of 1.36

seconds and 1.87 seconds for the two decisions (Table A9 also shows how quickly most

legislators made each decision). Analysis time was also very quick, although on average

it was longer for each decision topic than decision latency.

Comparing these results to the unadjusted decision latency and analysis time

results for graduate students, as was done in Chapter IV, revealed that mean decision

latency for the class size decision was over 11 seconds and for the privatization decision

was over 8 seconds for graduate students. Similarly, analysis times for the two decisions

were 11 seconds and over 10 seconds for graduate students. These differences are largely

attributable to the responses of four graduate students (see student numbers 3, 4, 6, and

18 in Table A9). These four students spoke at length and reasoned about one or both of 

their decisions before coming to a conclusion about their response to the decision

question, an atypical pattern. For example, Graduate Student 4 responded as follows to

the question about whether she would support or oppose legislation to transfer control of 

 public schools to a private company:


Ho, good question [ pause] I don’t really keep track of what the school board’s

 been doing [ pause] I am not generally impressed with the way public schools

have been functioning. Do I think privatization is the answer? They might

 become more efficient, just, but I’m also not sure that I like the idea of the, the

dollar being, which I’m sure it would be with privatization, with the dollar being

kind of the goal [ pause] and I don’t know if more or fewer corners would be cut,

with privatization [ pause] on the surface it sounds like a good idea [ pause] I, but

I, I think it’s a dangerous route to go down for schools [ pause] there’s a reason

that they’re public schools, it’s supposed to be for everybody, and I would be

concerned that somehow down the road, it wouldn’t be as accessible to lower SES

kids, even though in theory they should still all be free, once you start worrying

more about the finances than the education [ pause] I don’t know. I don’t think it

would be a good idea.

From the end of the policy question to the beginning of the last sentence took 82 seconds.

Decision latency and analysis time were coded for this student as 82 seconds because that

is how long it took her to reason about the question before reaching a decision.

Another example is from Graduate Student 3. In response to the question on

whether she would support or oppose legislation to limit class size to 25 students in all

 public schools, she responded as follows:

I, I think it would depend on a lot of things, like how affordable that is, how

feasible that is in a severely overcrowded school where kids would have to be put

in trailers, I’m just not sure, you know, where those trade-offs fall. I think the

Kentucky class size research is, is pretty clear on the benefits of smaller classes,


 but I, if I’m not mistaken, that’s mostly, that research is done with, with

elementary school kids. And I’m not really clear on, with older kids, across the

grades, across subject areas, how much of a difference class size makes [ pause] I

could imagine, you know, high-performing high school biology students who

could do perfectly fine in larger classrooms. And low-performing first-grade

students for whom 25 would be much too large of a class [ pause] it would also

depend to me a lot of ways on who else was in the classroom and was available

[ pause] you know, there are places where there are very large classes jointly

taught by two teachers, I think that’s been a pretty miserable failure [ pause] so it

seems to me that there are a lot of [ pause] I would want to have a lot more

information before I felt definitive on that question, and also [ pause] again, it

seems to me it might make a huge difference across what academic subject, what

grade [ pause] I’d want to see research that was very specific to those issue.

This response took 93 seconds from the end of the policy question to the beginning of the

final sentence. Her decision was coded as “support” because she later expressed that she

leaned towards supporting.

The question throughout my analysis has been how to calculate graduate student

response times because four students’ results may distort the data from the other fourteen

students. To this point, this discussion focused on unadjusted response times because

four graduate students reasoned at length before deciding, while no legislators did, which

suggested that there may be an important difference between the decision-making

 processes of students and legislators. In conclusion, it is fair to treat response times as an

important difference between the two sample groups and an important similarity. If you


look at all the graduate students together through mean values, graduate students did

reason in greater depth than legislators before deciding. If you exclude the four students

from the central tendency data then the adjusted response times for graduate students

look very much like those for legislators, as shown in Table 2. As a result, most graduate

students may not have deliberated before deciding, which makes them like the legislators

in this study. The next section covers similarities between the two groups.

Comparing Evidence and Rationale for Legislators and Graduate Students

In terms of the number of justifications and the quality of evidence they relied

upon, legislators and graduate students as groups did not diverge to a meaningful extent.

It was hypothesized that doctoral students in education would offer more justifications

for both decision topics and use more external evidence in explaining their decisions.

Although there were only minor differences in the results, as discussed in this and

 previous chapters, on average legislators offered more reasons to support their decisions

than graduate students (Table 2). Furthermore, for both decisions, a smaller proportion of 

legislators offered only personal evidence in support of their decisions than graduate

students. In other words, to a small extent, legislators, more often than doctoral students,

offered external evidence to justify their decisions (Table A4).

The only large difference in terms of evidence and rationale between the two

groups was in terms of word count in response to Question 1. On average, legislators said

approximately 50 percent more when explaining their decisions in response to Question 1

than graduate students (Tables 2 and A9). Of course more is not necessarily better, but

this difference does reveal that legislators and graduate students are delivering, possibly

even thinking about, their responses differently. It is not clear whether the differences in


how much legislators and graduate students said in response to Question 1 are

meaningful; however, the differences are sufficiently large to merit attention. One

 possible explanation for this difference is that legislators are more likely to explain

themselves and defend their decisions to others, so when asked to support their decisions

in this study they proceeded from habit. Additional evidence of how legislators’

 professional experience might influence their responses in this study is that they were

more likely than graduate students to spontaneously offer rebuttals to the

counterarguments they generated in response to Question 2 of the interview protocol.

Legislators’ and Graduate Students’ Comments about the Decision Models

Since the choice of decision models was discussed in connection with the second

research question and will be discussed again in Chapter VI, for purposes of the third

research question it seems sufficient to say that a large majority of legislators and

graduate students selected the intuitive model as the more accurate description of how

most people make some, if not all, political decisions (Tables A7 and A8). This similarity

 between the two sample groups on this variable suggests that at least in terms of their 

own assessment of their decision-making processes, legislators and graduate students did

not differ greatly.

Legislators’ and Graduate Students’ Certainty about Their Decisions

Aside from the differences in response times, the only other important measured

difference between graduate students and legislators was in certainty. For the class size

decision, legislators reported certainty of 2.47 (SD = .91) and self-assessed knowledge of 

1.90 (SD = 1.55). The results for graduate students were remarkably similar at 2.46 (SD =

.63) for certainty and 1.86 (SD = 1.47) for self-assessed knowledge. On the privatization


decision, both groups reported knowing less, with legislators’ mean self-assessed

knowledge at 0.73 (SD = 1.31) and graduate students’ self-assessed knowledge at 0.41

(SD = .79). The difference in self-assessed knowledge between the two decision topics was significant for legislators (t (34) = 4.73, p < .001 [two-tailed]) and for graduate

students (t (16) = 3.33, p = .004 [two-tailed]). Where the groups diverged was in the

certainty they reported that their decision on the privatization question was correct. While

legislators reported certainty of 2.40 (SD = .95) on the privatization decision, which was

not significantly different from the certainty they reported for the class size decision

(t (39) = .48, p = .63 [two-tailed]), graduate students reported certainty of 1.72 (SD =

1.12) which was significantly different from their certainty on the class size decision,

t (14) = 2.30, p = .04 (two-tailed).

In other words, even though they reported knowing less about the privatization

question than the class size question, legislators were not significantly less certain they

were correct in their decision about it. Graduate students reported knowing less about

 privatization and they were significantly less certain about their decisions about the

 proposed legislation, possibly as a result of their awareness of their limited knowledge on

the topic. This difference between the two groups is important because legislators should

have been less certain that their privatization decision was correct, given that they

reported significantly lower knowledge on this issue. That legislators were not

significantly less certain suggests that there may be a problem in the way they measure

their own certainty, or that they operate in a professional culture that rewards certainty.

Still, it stands to reason that if you think you know less about decision topic A compared

to decision topic B you should also be less certain that your decision on A is correct than


your decision on B. Graduate students’ responses followed this rule, but legislators’

responses did not.

That legislators were equally certain could be evidence that for legislators

certainty was an affective signal that was unrelated to how much information they had on

a topic or how much information they thought they had on a topic. Another explanation is

that as trained researchers, graduate students were more disciplined about avoiding

unsubstantiated certainty. Thus, when they knew less about a proposal, graduate students

knew to be less certain that they were making the correct decision on that proposal. In

contrast, legislators’ professional experience may have led them to conclude that

certainty was important, whether or not it was justified by the amount or quality of 

information they have available.

In proportional terms, almost three times as many graduate students as legislators

reported knowing less about privatization and reported less certainty about their decision

concerning privatization. Seven graduate students (38%) reported lower self-assessed knowledge and lower certainty for the privatization decision, while only five

legislators (12%) did the same. This could be evidence that graduate students were more

likely to lower their certainty judgments when they had less decision-specific

information. Seven legislators (17%) also reported that they were more certain that their 

decision on the privatization question was correct than they were that their class size

decision was correct, even though these seven reported having no more (and in some

cases less) information about the privatization issue. Almost the same proportion of 

graduate students (three or 16%) did the same thing, which suggests that graduate


students may not be more likely to lower certainty judgments in the face of less decision-

specific information.

Discussion

This discussion section examines the second and third research questions. The

data collected in this study provided evidence of the influence of preconscious processes

on decision-making and reasoning about policy issues. With regard to the second

research question, participants’ decision making and reasoning about class size and

privatization differed in important ways, although the source of those differences is less

evident. Similarly, there was evidence that legislators and graduate students decided and

reasoned differently, but it is unclear how important these differences are or how

legislators’ and graduate students’ experiences and knowledge might have led to these

differences.

Differences in Decision Making about Class Size Limits and Privatization

There was evidence that participants made and reasoned about the two decisions

differently. What is not clear is whether these differences flow from how familiar or 

unfamiliar the topics were to the decision makers. How much information participants

had for each decision topic may have shaped how they made their decisions about each

 proposal, but there are other processes to consider. For instance, both legislators and

graduate students offered more justifications to support their decision on privatization

than they did for the class size decision, even though both sample groups reported

knowing less about the privatization decision than the class size decision and more

 participants from both sample groups reported that the privatization proposal was novel

than did so for the class size proposal. Based on self-assessed knowledge, novelty, citing


evidence, justificatory rationale, and participants’ comments, it appeared that the

 privatization proposal was less familiar. The question then becomes why the mean

number of justifications supplied was higher for the privatization decision than for the

class size decision for both groups.

One response to this question may be that because the privatization issue was

novel for most participants, most thought consciously about the decision question and

relevant considerations while reaching a decision, whereas they simply responded with

their overall evaluative tally for the more familiar class size decision. This explanation

would be consistent with shorter decision latencies and analysis times for the class size

decision and with several participants’ theories that decision makers decided in

accordance with the traditional model for new issues and the intuitive model for familiar 

issues.

Another interpretation of these data is that, consistent with the original design of 

the study, the privatization question was novel for the vast majority of participants and it

was an emotionally provocative issue for many because it aimed to replace a core

function of government with private enterprise. Particularly for legislators, as many of 

them noted, supporting the proposal would have considerable political costs, which some

of them felt viscerally. The excerpts cited in Chapters IV and VI suggest that legislators

often conveyed general negative feelings about the proposal. Their decisions may reflect

these feelings. This was not true in all cases, especially for those few inclined to support

 privatization, but the evidence suggested that a negative affective response influenced

many legislators’ and graduates students’ decisions.


There was also evidence of preconscious influences with regard to the class size decision. Given that decisions were offered almost instantaneously by the vast majority of participants, conscious reasoning and deliberation prior to deciding was unlikely. It is possible that many participants had a positive

affective response to smaller and more disciplined classes, and this may have led them to

support the proposal. At the same time, many participants could have had a negative

response to the added costs and what some described as a misguided reliance on smaller 

class size as the means to improve academic achievement.

An alternative explanation was offered by those participants who theorized that

 people made policy decisions in accordance with the traditional model when they had

 prior information about the decision topic, and operated in accordance with the intuitive

model for novel decisions. Under this theory, as with its counterpart discussed in the

prior paragraph, quick decision making was hypothesized to be the product of

an overall evaluative tally that was based on conscious reasoning about the class size

decision on prior occasions, the product of which could be reported almost

instantaneously in the present study.

When asked how quickly they made their policy decisions (reported speed to

decision) some legislators, but no graduate students, said quickly or instantaneously but

qualified their answers by explaining that they were able to answer so quickly because

they had deliberated upon the topic previously. This explanation is an important one for 

the decision-making processes under consideration, and the operation of the overall

evaluative tally must be given special attention; however, this explanation does not

withstand scrutiny here because legislators also made the privatization decision very


quickly, even though 3 out of 4 said they had never thought about or discussed the topic

 previously. The difference in mean decision latency between the two decisions was not

significant for legislators. As a result, the contention that decisions were made quickly

only when the decision maker was familiar with the decision topic did not appear to hold.

There was strong evidence that legislators and many graduate students made both

decisions before reasoning about the decision questions, which is inconsistent with the

traditional model. This evidence undermines the descriptive accuracy of the traditional

model. As to the second research question and how decision making and reasoning

differed for the more and less familiar issues, the results were not as clear. There was

evidence in legislators’ comments that the privatization decision was more often the

 product of a visceral reaction to the proposed legislation, but at the same time decision

latencies for legislators and most graduate students on the class size decision were too

quick to accommodate conscious reasoning prior to decision making. Based on the

research reviewed in Chapter II, there is reason to believe that decision making on both

of these questions was influenced by preconscious processes or signals. As one legislator 

noted, she used the traditional model in most instances, but the intuitive model for both

of the decisions in this study.

[I use] the first model in, in most of what I do but on these two questions its very,

its just a core issue, so it’s not, while I will continue to gather information, my

gathering of the information is tends to be more in terms of trying to be more

effective in the debate rather than assuming that I’m going to take a major change

of direction. (L36)


This was so, she continued,

[b]ecause it really is [ pause] to me a absolute cornerstone of [ pause] American

society is the concept of public education, being available to everybody and that

we have a duty to [ pause] educate the young people and not only do we have that

duty, but its also in everyone’s vested interest that we have that kind of strength

in this country. (L36)

Based on the data collected, this legislator may have been correct in her 

assessment that these two questions were more likely to provoke an affective response. If 

the study had asked one educational policy question and one question about a less salient

issue, it is possible there would have been greater differences in how participants

answered the two questions. In that case, the second research question could have been

revised to ask how decision making differed based on the emotional salience of the

decision topic, rather than on the topic’s familiarity. However, as another legislator 

observed, finding an issue about which decision makers were neutral might be difficult.

He began by giving an example of such an issue, but then he concluded that a decision on

that issue also would likely be the product of intuitive feelings.

[T]here are areas where we have no intuitive feelings about it, you occasionally

come up with an issue like whether optometrists should be permitted to put eye

drops in someone’s eye or whether only ophthalmologist should be permitted. I

doubt many people have intuitive feelings about that, on the other hand [ pause]

the intuitive feeling could be “I like doctors and I don’t like optometrists.” “I

 believe in the MDs” you know people have intuitive feelings about doctors versus

chiropractors. So maybe there are some and in a situation like that actually

[ pause] really I guess intuitive feelings would govern because you wouldn’t really

know anything about the eye drops one way or the other but if you are kind of 

like pro-doctor you are going to go with the doctors on it. (L2)

Differences in How Legislators and Doctoral Students Made Decisions

The differences between legislators and doctoral students were more obvious than

the differences between participants’ decisions on the class size and privatization

 proposals. In brief, legislators decided more quickly, offered reasons more quickly,

offered more decision-specific information, had more to say in response to interview

questions, were more certain in their responses, and were more likely to offer unclear or 

confusing responses. It was not hypothesized that legislators would have more decision-specific information; that conclusion was based on the number of reasons they offered, how much they said in response to Question 1 (word count; Table A9), and their citation of evidence and justificatory rationale. Again, contrary to expectations, it seemed apparent that

legislators had more direct experience with the costs and other consequences of limiting

class size to 25 students and transferring control of public schools to a private company.

It makes sense that state legislators would be reasonably well informed about

 public education issues, but there was reason to believe that doctoral students in

education would have more decision-specific information, would cite more external

information, and would have a greater understanding of the considerations and

consequences of each decision given that doctoral students’ professional work is to study

education. In the end, neither group offered much decision-specific information about

either proposal. One legislator explained that legislators had more time and opportunity

to consider political issues than citizens who were not elected officials: “I think 

legislators [ pause] by definition have much more time to spend filtering the information

[on policy questions] than the non-legislator does who might only have 5 minutes per day

to think about these things” (L41).

Because of their status as elected officials, it stands to reason that not only do

legislators have more time to spend on policy questions but they also receive more

information about policy questions from interested parties. Further, if enacted, the

specific proposals presented could have had direct political consequences for the

legislators, which was not the case for doctoral students. Based on legislators’ comments,

it appeared they treated the proposed legislation in the study as actual legislative

 proposals, so they could have been influenced (preconsciously or consciously) by the

 possible political consequences of supporting or opposing either proposal.

In explaining their decisions or in discussing the decision models, legislators mentioned a number of considerations, influences, or pressures (referred to collectively as "factors") on their decision making that graduate students did not. For example, these

factors included: the cost of administrative oversight of schools if control is transferred to

a private company; voting with the majority of your constituents on those issues where

the legislator knows what the majority wants (e.g., class size limits); voting with a

committee chairman or with party leadership to improve your prospects in the legislature;

the local school board spends more than half of county funds so privatizing it would be

like privatizing the county council or the legislature; setting local policy at the state level

without providing the funding to enable it is an unacceptable and unfunded mandate;

voting to privatize public schools as a Democrat could end any hope of re-election; and

how your policy decisions would look on an opponent’s mailing during an election

campaign, or how they could otherwise be cast in an unfavorable light by opponents. Based on this list, there is reason to believe legislators were considering factors that were outside of the graduate students' experience.

This section concludes by returning to the most obvious difference in how

legislators and students made their decisions, the difference in mean decision latency and

mean analysis time. Four graduate students reasoned for 26 to 93 seconds about the

decision question before reporting their decision, while no legislator took more than 9

seconds to make a decision. It was hypothesized that graduate students would be more

deliberate in making their decisions, but it is not clear why only four graduate students

spent more than 20 seconds on one or both decisions or why no legislator did. It may be

that the conditions under which legislators make policy decisions require quick and

certain decisions, while graduate students more often decide without similar time

pressures. If this is the case, then it is ironic, given that legislators are the ones charged

with making policy decisions.

Chapter VI

PARTICIPANTS’ SELECTION AND DISCUSSION OF DECISION MODELS

This chapter presents participants’ comments about the traditional reasoning and

decision making model in Figure 1 and the intuitive decision making and reasoning

(IDMR) model in Figure 2. These results are presented in two sections: the first concerns what participants said when they were asked to select the decision model that more accurately described how most people made political decisions and how they

themselves made political decisions. The second section outlines the responses of those

 participants who said that the two decision models applied to different types of decisions.

Participants' Responses about Decision Models

After Part 1 of the follow-up interview for each decision question (concerning

class size and privatization) was completed, participants were shown the decision models

in Figure 1 (traditional model) and Figure 2 (IDMR model). As they looked at each

diagram, the interviewer described the models, highlighted the differences between them,

and defined key terms. They were then asked which model more accurately described how most people made political decisions; after they responded to that question, they were asked which model more accurately described how they themselves made political decisions. If their responses suggested that they had more to say on the topic, they were asked follow-up questions. What participants said in response to the questions about the decision models led to four conclusions about participants' decision making and reasoning.

The first conclusion was that decision makers’ thinking about their own decision

making is the product of an idiosyncratic process. In other words, while there were

explanations and observations that several participants shared about the decision models

and their own decision-making processes, each person responded in a unique way about

how he or she decided and reasoned or why he or she decided as he or she did. This was the same

conclusion reached in Chapter V about each participant’s decision-making processes in

response to the decision questions.

Second, the traditional model of reasoning and decision making is inadequate to

describe how participants make complex policy decisions. For instance, in describing

their decisions and decision-making processes, it appeared participants were constructing

their responses on the spot, as evidenced by the halting, disorganized way in which they

explained themselves. There were countless pauses, redirections, and corrections. One

view of participants’ meandering responses is that they are evidence that decision making

about complex questions was not as premeditated and deliberate as the traditional model

requires. As such, these responses could also be construed as evidence that preconscious

 processes influenced decision making about complex questions. This conclusion is based

on both what participants said and the manner in which they constructed and

reconstructed their ideas as they said them.

That participants were apparently constructing their reasons after making a

decision is evidence against the traditional model because that model posits that

decisions and choices of the sort participants made in this study are products of conscious

reasoning. That participants offered disorganized responses about the justifications for 

their decisions reveals something about the decision making process separate from what

it reveals about the process of verbalizing justifications. If the traditional model were accurate, the questions asked in this study would have caused participants to reason in a

systematic and goal-oriented way about their response alternatives before they verbalized

their responses to the questions. In that case, what participants actually verbalized would reflect the systematic and goal-oriented process

they followed in making a choice. Because participants’ explanations did not appear 

organized based on the objective of selecting the optimal response alternative (i.e.,

maximizing utility), there is no reason to believe that their decisions were products of 

that objective.

Finally, certain participants' observations suggested that the decision models may be decision-specific. The next sections present passages to support the second and third conclusions.

Participants’ Decision Making Was Subject to Preconscious Influences

In reviewing the two decision models in Figure 1 and Figure 2, a large majority of 

 participants reported that intuitive processes influenced their own decision making and

the decision making of most people. Tables A7 and A8 showed that 24 legislators (58%)

and 16 graduate students (88%) reported that the IDMR model more accurately described

how most people made political decisions (for chi-square analyses of these data, see

Table A10). Another six legislators, for a total of 30 legislators (73%), and another two

graduate students, for a total of 18 (100%), believed that the intuitive model was more

accurate for some, if not all, political decisions. This distinction between types of 

decisions is discussed more fully in the next section. However, to summarize, some

 participants offered responses that indicated that the IDMR model more accurately

described how people responded to “hot button” issues like abortion and gun control,

while the traditional model described decision making about other, less emotionally

salient issues. Others hypothesized that the IDMR model might apply to new issues,

while the traditional model applied to questions for which we had background

knowledge. Some hypothesized the opposite, that the IDMR model depicted how we

decided familiar issues, while the traditional model applied to new issues.
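As an aside on the counts just reported, the following is a minimal illustrative sketch of how the difference in model choice between the two groups could be tested. It is not the analysis presented in Table A10; the use of a chi-square test and of the Python scipy library is an assumption made purely for illustration, and the cell counts are simply those implied by the percentages above (24 of 41 legislators and 16 of 18 graduate students selecting the IDMR model for most people).

# Illustrative only: 2 x 2 table of model choice by group, built from the counts
# implied by the percentages reported in the text (not the dissertation's analysis).
from scipy.stats import chi2_contingency

table = [[24, 41 - 24],   # legislators: IDMR vs. any other response
         [16, 18 - 16]]   # graduate students: IDMR vs. any other response

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

With an expected count this small in one cell, Fisher's exact test would arguably be the more defensible procedure; the sketch is intended only to make the structure of the group comparison concrete.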

The number of legislators choosing the IDMR model might have been higher if more legislators had taken time to consider the two models and to compare them.

Approximately 1 out of 5 legislators’ responses were unclear, indicating that they did not

understand the models or did not understand the differences between them. By

comparison, all graduate students appeared to understand the models and all of them

reported that the intuitive model was more accurate for some, if not all, political

decisions. This difference between legislators and graduate students may also be the

result of their relative experience with theoretical models. Legislators may not often

encounter such models and therefore may be less adept at quickly evaluating and

discussing the models.

The following is an example of a legislator’s confusing response about the

decision models. The question asked which model more accurately described how most

 people and how Legislator 5 made political decisions, and he responded as follows:

I mean, there kind of you know two unique situations, I think in a political

situation, I think the vast majority of people given the proper information would

use number 1 [Traditional] however the second format is giving the proper 

information and then trying to measure that against the political climate, in other 

words what’s favorable in supporting or not supporting.

[My question: So it may be situation-specific then?]

So in other words you may have the best theory and it may make all the sense in

the world, but politically you can’t explain it [ pause] in a small [ pause] context or 

in a soundbite, the it might not be feasible for you so you have to find another 

way to do it. It’s like to say we can have the best educational system but it is

going to cost us X number of dollars and we need to raise taxes and this is why.

Could be the great argument, it takes you 30 minutes to explain it and the guy

next to you says I am against taxes, no new taxes you know the schools are bad,

the teachers are bad, everything is bad, no new taxes, they have enough money,

they don’t use it property. If that seems to be the commodity that’s selling you

you’re not going to use the first model. You’re going to try to figure out a second

model. (L5)

Another response about the decision models coded as unclear came from

Legislator 6:

[My question: Which model more accurately describes how most people make

 political decisions?]

I would have to say yours, [the intuitive model] it is very, very good because most

of the legislators, in spite of what the press and what a lot of people say are very

conscientious about what we do and how its going to affect down the road more

or less not just, gee I’m here and I want this to happen.

[My question: Do you think then also for your own decisions, this is a better 

model?]

Yes, much better.

[My question:You do have this sort of holistic feeling or intuitive decision first?]

Participants who appeared to understand the differences between the two models, and particularly those participants who mentioned

having thought about their own decision-making processes independently prior to the

interview, showed remarkable insight in discussing the models and how preconscious

influences might shape political decisions. Some comments were entirely consistent with

the literature reviewed in Chapter II.

Only as many excerpts from participants' responses are included here as are necessary to give a sense of the range and depth of their thinking about their own decision making, and excerpts are cited in their entirety when appropriate. As a result, this section consists primarily of excerpts from interview transcripts.

The clearest comments about the possible influence of preconscious processes on

decision making and about the shortcomings of the traditional model were in a graduate

student’s response to questions about the two models. After the two models were

described to her and their predictions about the decision-making process explained, this

student selected the intuitive model as the more accurate representation of how most

 people made political decisions. She continued as follows:

I mean, even just having gone through those two cases [the two decision topics], I

definitely did that [felt an intuitive decision]. Maybe not [ pause] equally [ pause]

wasn’t equally strong in both, but I definitely did that.

[My follow-up question: Okay. The follow-up question to most people is how do

you make (political decisions), but you answered both, sort of, you know, at the

same time.]

Yeah, and I’m trying to think of a case, actually, where I don’t do that [feel an

intuitive decision first]. And I, I don’t know that I can. It’s not something I’ve

thought about. But now, reflecting on some of the decisions that I make, I think I

definitely make them at a gut level. And then I rationalize my way through that

[ pause] and I’m not even always sure why I have that gut feeling, and couldn’t

even probably tell you why I always have it, but I have it. (GS16)

Her description of the decision-making process as consisting of an initial intuitive decision

followed by post hoc rationalization is consistent with the studies cited in Chapter II

indicating that people rarely know why they make the decisions they do because the

 processes that caused the decisions are often not available to conscious reflection (e.g.,

Damasio, 1994; Epstein, 1990; Nisbett & Wilson, 1977).

 Not all participants offered this sort of extended explanation of their own

reasoning processes. Some answered the questions about the models in a couple of 

sentences. For example, one legislator answered the question asking which model better 

described political decision making as follows: “Interesting. I actually think most people

 probably make use [of] this model [Intuitive]. I actually think I tend to use the that model

[Traditional]” (L1). This legislator seemed to understand both models and the differences

 between them, but she did not feel it necessary to explain her choices. It is not possible to

determine how well this legislator or other participants who responded without

elaboration understood the models.

By comparison, in response to the same question and without prompting, the next

legislator interviewed responded as follows to questions about the decision-making

 process:

Well, if you are just talking about the reasoning process [ pause] Devoid of 

 politics then the one about the preconscious influences certainly but you gotta

keep in mind that because obviously all of us have memories and experiences and

[ pause] predilections that we, either guide us or we have to overcome in order to

do what we think is the right thing to make the reasoned decision as your box says

 but you gotta keep in mind which and I know if, if this is simply the reasoning

 process then that is fine but if there’s the issue simply of money [ pause] who is

donating money and I think that I have come to the conclusion trying to work 

what I just said into your model that most legislators that who receive political

donations for a variety of reasons don’t allow themselves to think that they have

 been influenced by this because you know obviously that’s you know we don’t

want to be viewed as having been bought, so what happens is you know you get 

the donation and that becomes part of your preconscious. In other words, it

 becomes part of that good feeling or bad feeling you have and I do think that it

 but it gets kind of blurred in that feeling you know you just have a good feeling

about the teachers or you have a bad feeling about the teachers you have a good

feeling about the horse racing people or you have a bad feeling and so there are

many, many things that are obviously going to go into that preconscious feeling

as well as do you have a lot of teachers in your district, or a lot of horse racing

farms in your district but you’re diagram I think would be you know the

 preconscious is the thing that governs most of us. (L2) (emphasis added)

This legislator’s comments about the influence of donations on legislators’ decisions, and

how donations might unconsciously sway even a scrupulous and self-aware legislator, are

enormously important to the study of legislators’ decision making, campaign finance and

 public policy.

In response to the same question about which model better described how most

 people made political decisions, another legislator offered an assessment of how intuitive

 processes shaped his decisions, and how he consciously tried to understand and limit the

influence of such processes.

Let me say that one of the things that I have thought about in this process and I

have given some thought as to how I make decisions. I came to this office with a

frame of reference that was the result of my life experiences and that is the frame

through which I see legislation as it comes before me and I’ve had to understand

that that frame of reference may have brought with it some biases and so when I

go through this process of conscious reasoning because I understand that I have

got an intuitive sense given my life experiences that I have to ask myself some

questions to be sure that I am not projecting my bias into decisions. (L4)

The idea of a frame of reference that filters or colors information and orients or 

directs one’s decision making was echoed by other legislators. For instance, “Yeah, I

think that’s how most people make a [decision]. First they make a decision based on

whatever their frame of reference is, then the question is will they be willing to change

their decisions, I guess if data is [ sic] presented that contradicts what their intuition told

them” (L26). In terms of preconscious processes, a frame of reference could be described

as a cue, heuristic, schema, or affective signal. Or a frame of reference could be

described as a mental model or mental representation of the decision question that is the

 product of preconscious processes.

After selecting, in eight words, the intuitive model to describe how most people and how he himself made political decisions, another legislator offered the following candid

explanation of how unsystematic the legislative decision-making process could be when asked if there was anything he would add to either decision model to make it more complete.

I think that in, well part of the intuitive decision I think is [ pause] personal

ideologies, I mean a legislator chooses a political affiliation because of personal

ideologies and I think that is a big [ pause] intuitive part of his decision-making

 process and, and the main reason I say that is because in the session we you know

we have committee system here in [name of state] and there is a lot of bills and

legislation that come through that we are required to vote on that we absolutely

 just don’t read so based on the title of the bill or [ pause] if we do get a chance or 

who is sitting around us that we can look to for help [ pause] it’s a gut feeling and

 based on your personal ideologies because you know I am on a health committee

so I focus personally just on health issues. I don’t focus on budget issues, I don’t

focus on economic issues or anything, I am staying focused on health issues.

[My follow-up question: So you’re saying when it comes out of committee and it

is on the floor for a second, third vote. And then you, let’s say its an economic

issue or it’s an education issue and you haven’t been briefed on it as you, then

you are sort of using these sort of cues, you know who brought in the legislation,

who is supporting it?]

I would say probably 80%, 85% of legislators do the same thing because I mean,

it’s, there is a lot of legislation that goes through here. I have 3,000 bills in a

course of a session, 90 day session [ pause] and I think that a lot of people when

they bring in an idea for a bill they need to consider that and whether how they

title the bill or what subject matter is, because a lot times you just don’t have

time, especially towards the last 30 days some more pretentious [ sic] bills are you

know will get some pretty good debate on the floor which tend to [ pause] educate

a lot of people right then and there so you have an idea of what it is about [ pause]

you know we have a copy of the bill in front of us and a lot of times we will be

able to scan through what it is about but not the details of it [ pause] but you know

a lot of times you just look around to someone that was on that committee that

heard the hearing and you go, “up or down?” you know and [ pause] they usually

would tell you. A lot of times you learn to respect the opinions of like-minded

individuals [ pause] and those are the people you look to, so you [ pause] and a lot

of times that’s not by party affiliation either, I mean I sit with a very like-minded

individual right to my right on the floor and [ pause] we are in opposite parties.

We are almost identical in our ideology you know in our personal beliefs so we

trust each other’s opinion on a lot of legislation. And then there is the politics of it

over and above that. Sometimes politics rule the decision. (L16)

Instead of explaining that his decisions were based on reasoning about decision-specific

information on each piece of proposed legislation, weighing evidence and considering the

consequences of various alternatives before selecting the alternative with the highest

expected utility, this legislator talked about basing his decisions on personal ideologies,

gut feelings, the title of the bill, or who was supporting it.

Another legislator from a different state that has an even shorter legislative

session offered a similarly persuasive challenge to the traditional view that political

decisions are the products of conscious reasoning about the subjective expected utility of 

the range of possible decision alternatives. First he answered which model was more

accurate.

Well, I mean it’s got to be this one.

[My question: The intuitive model?]

When something is thrown at you [ pause] like a bill, an easy way to think of it

[ pause] I can't, this model [the traditional model] suggests say OK we’ve got this

 bill, let’s research the bill and all 140 people in the [name of state] general

assembly and however many, they got some crazy number in [name of 

neighboring state] are going to research it because it’s before them. Well, that just

isn't the way it works [ pause] the bill comes in, say I have it in a committee the

next day and I read it and I run it through my filter, my philosophical filter which

is probably what you've got here [on the intuitive model] in one form or another 

[ pause] and I come to my initial conclusion and then if it is important to me, I'll

do the research or I'll bat it around some more. Just remember in [name of state], I

don’t know what the pace is like in [name of state], but in, I've only been in one

year. We went through like 3,000 bills in one 45-day session.

[My comment: It’s about the same in [name of neighboring state] but they have a

90 day session.]

Okay [ pause] and we are I’m now going into a 60-day session and I am told by

grizzled old veterans that the two week difference makes all the difference in the

world [ pause] but there’s only a few bills that I feel the need to go beyond this

[the intuitive decision]. (L38)

Several other legislators from both states also explained that they had no choice

 but to rely on intuitive decisions in the place of conscious reasoning about each piece of 

legislation, given the time constraints and the volume of legislation. “And the reason, one

of the reasons I say that is that in the [name of state] general assembly we consider 3,000

 bills in 45 days, so there’s not time. You [ pause] you bring to it you know as you say,

your background sort of your intuitive response I think, I think that is right. And then you

hear what people have to say, which may change it or it makes you think more about it”

(L31).

Many legislators chose the intuitive model and then explained that time constraints and the strength of initial intuitive decisions made persuasion very difficult, so that those intuitive decisions often prevailed. For example, one legislator observed that it was hard to

“overcome that [the initial intuitive decision] in the legislature because [ pause] our 

legislature is like two months and it’s like that crucible just like you’re getting crushed

from every single side so to have to take some time to get somebody to get over their,

their first impression where they say no and then get them the information, there is no

time, it’s just more difficult” (L31). And, as another legislator from a different state

observed, once legislators have an initial gut feeling about legislation, if they “don’t hear 

otherwise [in the form of evidence contrary to their initial leaning] then you tend to get

hard and fast in your position. Once you’re there, very few of us change our position on

issues” (L20).

Certain participants’ responses indicated that they interpreted the intuitive model

to be inferior to the purely conscious model, but they still used it to describe their own

thinking.

Lord, well, you know, the socially acceptable what’s the word, [my comment:

desirable] yes, you know the thing would be this first one [traditional]. You know

listen it’s a combination. I have my prejudices, I prejudge based on you know my

own sort of philosophies that I carry around with me and [ pause] you know those

 probably get in the way of hearing everything out that maybe I need to hear in

terms of making conscious you know reasonable decisions. But I attempt to,

 believe me sit through hearing after hearing and I attempt to do the traditional

reasoning and decision making model but [ pause] sometimes it does just comes

down to sort of my own my own intuitive gut-feeling, bottom-line philosophical

feelings about certain things. I believe many of my colleagues more frequently

than I do, do this intuitive. (L24)

Other legislators also selected the intuitive model even though they mentioned that it was

inferior. For instance, “I would like to say that I am better than that, but you know I’m a

human being” (L27). Or, in response to the intuitive model’s prediction that a

 preconscious decision influences the final decision, “perhaps I get mentally defensive at

the suggestion that I wouldn't think through it” (L39).

To illustrate the influence of preconscious processes on political decision making,

as described by participants themselves, consider one final passage from another 

legislator who selected the intuitive model to describe how most people and how he

himself made political decisions. After selecting the intuitive model, he continued:

I don’t know, if I thought about it, I could probably come up with different

terminology [than what is set forth on the model diagrams]. We all come up with

certain preconceived notions and they are based on it, not just the informational

stuff that you’ve got listed here, they are based on [ pause] attitudes and

ideological bents and [ pause] personal experiences and so on. [These influences

on decision making don't] necessarily relate to information, they just relate to the

whole of what it is that makes you who you are you know and so if you lump all

that together and call it intuitive [ pause] it would be the factors that would go into

it.

[My follow-up comment: I mean the terms in [the] literature are things like

worldview, first principles, you know or principles in which you make decisions

and act on them.]

The world would be a hell of a lot simpler and the legislative process would be a

lot simpler if the first model did in fact [relay it] you know but where we get all

mixed up in this stuff, you have trouble getting to the conscious reasoning

 process, where you bring in information and so on because people already have

their own hang ups. They already figured it out, they already know so when

confronted with information, they either, one don't listen to it mold it to their, spin

it to their purposes or whatever, and then make it part of their decisions so. I

suspect what a whole lot of us do is, is that we have, we have [an] inclination, we

are, we are generally liberal people, or we are, we make these decisions like this

and then we take this stuff up here and hang it on to justify it.

[My second question: Which model do you think better describes how you make

 political decisions?]

I'm an opinionated character who has been around a long time and it’s a sum

total, sometimes I, sometimes I would describe it as being [ pause] particular 

[ pause] and you know I kind of restrain myself internally and say okay you know

continue, you know listen completely to the question or idea and what have you

 but [ pause] and, and so I formulate this intuitive position [ pause] or I recognize

this intuitive position on the issue and may if asked right then and there where do

you think you are on this? I would say, well I am inclined to oppose this but then

always go back and apply a more deliberate process, listen to both sides of the

story, you know the issue, do a little bit of independent research, see what my

colleagues think, try to identify the competing interests on an issue. Sometimes

issues are determined not by the substance of the policy but the effect [ pause] and

the different effect on different players that are involved and then come back and

reformulate my, my position so for me I think it is a combination of both and it

moves in both directions. (L13)

One final example of a legislator explaining how she exerted conscious control

over intuitive processes to decide in accordance with the traditional model came from

one of two legislators who supported the proposal to privatize public schools.

The traditional tends to be what I’m, I mean yeah. I mean, I, I have to honestly

tell you as an issue would come up, there are sometimes those feelings but you

know in terms of actually getting to the decision I think, but then see part of the

feelings are because of prior experience and prior knowledge and you know so its

kind of hard to separate it but [ pause] I tend to I think [I’d] be more the first [the

traditional model]. But I have seen an awful lot of people that, that do kind of a

gut level response without getting any facts to it. I don’t know if that’s necessarily

what you have here [in the intuitive model], but I do see that happening. (L23)

Many decision makers may try to take into account their dispositions, biases, and preferences to ensure that their decisions are rational; however, cognitive limits, time constraints, and information costs likely make such conscious regulation the exception rather than the rule.

Participants’ Reasons Were Constructed while Responding

Three excerpts in this section illustrate that many participants’ responses to

questions about the models and about their own decision-making processes were not

 products of a systematic and organized process based on conscious reasoning prior to

responding. These responses indicate something different: an immediate leaning or 

decision followed by conscious reasoning while responding.

These excerpts are included in their entirety to illustrate several important points.

First, these responses, which are representative of many other responses, show that

 participants were thinking about and creating their responses as they spoke them, rather 

than thinking about the responses and completing the reasoning process before speaking

them (as indicated by the traditional model). These responses can be interpreted as

evidence that people sometimes generate explanations for and rationalize their decisions

after they have made a decision. Although this may simply reflect the informality of conversation, participants' responses in this section show that few if any

 participants proceeded in a systematic way in responding to interview questions about the

decision models. Second, presenting the entire response illustrates how legislators and

graduate students spoke about complex decisions with very little obvious monitoring or 

quality control of what they uttered. In other words, each of these responses could have

 been considerably shorter and more focused, but they were not because participants were

not self-regulating. Finally, these long responses are presented in their entirety to

illustrate that even in a face-to-face interview setting which was recorded, participants

decided and reasoned about their decisions in a very informal and free-flowing way,

which is offered as evidence that “real” complex decisions are often made in the manner 

 participants made the decisions in the present study. In sum, decision making about

complex questions does not appear to be a systematic process.

One graduate student, the only one of the 59 participants who asked detailed questions about the two models before answering them, provided the following summary of her thoughts in discussing her responses about the decision models.

Okay, well, I think the key is with these two [models], is that it, now that you’ve

 put it into terms of, in other words, I would say for, they’re, I almost want to say

that, that both of these models might fit, it’s just a question of the topic that

you’re asking about. I mean, the topics you’ve cited, for this one in particular,

have been very controversial, inherently emotionally laden, okay, abortion’s

another. So [ pause] you know, more of the things you were asking today, I mean,

for me, actually there were some that were more personal, so they had kind of an

emotional component, but I just wonder whether, I mean, maybe this is the, then

this would be the model, but sometimes you don’t really have much of an

intuitive decision, because it’s not something that you’ve felt much about or been

exposed to much. So I mean, I would say, I guess, this would be the most

comprehensive model, but there’s not necessarily always an intuitive decision.

(GS15)

Another graduate student answered the question about which model better 

described how she made political decisions as follows, without any intermediate

 prompting by or questions from the interviewer.

You know, actually, that’s a good, that’s a good question, because I think about

that, and it’s actually something that I think about, you know, how do you decide

who to vote for? And [ pause] I was looking at this [ pause] debate, what was it,

the [ pause] [Iraq], yeah, about the, yeah, I was looking at that, and [ pause] you

know, part of me, you know, I had this like negative visceral reaction, you know,

oh, I hate watching these things because, you know, it’s just a show, they don’t

really say what they’re gonna say, and blah, blah, blah, blah. And so, my first

reaction was to turn it off, and to not even pay attention to it. But then, you know,

I said, well, no, let me look at it and hear what they have to say. And so, you

know, I was looking at, not so much what the candidates were saying, but, you

know, the people who got up and, [unintelligible] the audience, versus the people

who sat down in their chairs, [unintelligible], people who snickered, or, you

know, what kind of things, and so, it’s some of those subjective things that, you

know, it’s like, you know, I kind of like this person or kind of not. And I think a

lot of people do that, you know, that they don’t necessarily pay attention to the

content of what people say. I think they just kind of, so [ pause] for my style, I

think I, I think I do a little bit of both. I mean, I think I try to make an effort to get

factual information, and to try to remember what people have done in the past,

and kind of go on their record, and to see what the, you know, what the [ pause]

environment is like, you know, what are the things that the United States needs,

and blah, blah, blah, or whatever, my community [ pause] and I think more

logically about it. But then on the other hand, you know, you get somebody, who

was it, one of the delegates in my county was running for something, and he made

this off-hand remark [laughs] about how, well, you know, all of the Hispanics get

landscape jobs, and all the Asians work in nail salons. And [ pause] and just from

making that comment, you know, I was just like, you know, this person, if you’re

gonna categorize people like that, you know, I’m not gonna be voting for you. So

it didn’t matter what platform he stood for, his personality just didn’t, you know,

so I think it can be just like that, so [ pause] so I think I can flip-flop [ pause] I

think most people do the intuitive thing, though, and they don’t necessarily gather 

all the information that they, that they need to, to make a good decision, or reflect

on, you know, what has this person really been doing, like Arnold

Schwarzenegger, you know, you can look, okay, what kind of, you know,

 political experience does he have, what has he done, you know, that makes him

 be such a good candidate for governor, you know, so, it’s like, no, he’s, you

know, he looks good, you know, and whatever, so we’ll vote for him. You know,

it’s just, I think most people, and I, I know I’m swayed by the intuitive stuff, but I

try to be more traditional in making political decisions. (GS12)

To make clear that this sort of prolonged or constructed explanation was not

limited to graduate students, here follows one of many examples of a legislator 

responding in the same way when discussing the decision-making processes in

connection with the two models.

Well, it is such, you have to understand what we are working with down here. I

am not sure they even know and I am not sure, I would say yes, the majority of 

 people do traditional decision making. In the general assembly I would say to you

that a great deal of the final decision making has a lot of intuitive decision

influenced by the public. The public which is who we should be influenced by.

They are the people that send us down here to work for them. I mean I feel like I

work for you [ pause] and even if [my assistant] works for me, I feel like I work 

for her out in the community [ pause] in her best interest so I you know I would

say definitely in this job, a traditional decision making, running my house is

decisions that I have done, you know raising my children and running my house

and still have a career and [ pause] and a family life and all those other things put

together but then whenever you are in I guess in a business [ pause] it has, a

general assembly in essence is a business that is driven by the public so then I

would say that [ pause] in those decision making it would be intuitive decision

making. But I would say that generally, overall most people that I know usually

do traditional reasoning and mainly in decision making and [ pause] but of course

during general assembly, during this process I would say there is a great deal of 

intuitive decision making. (L7)

While these and other participants’ responses to interview questions suggest that

decision making about complex questions is not systematic, which is contrary to the

traditional model, it is worth noting that participants’ public explanations of their 

decisions may not necessarily reflect the reasoning that underlies those decisions.

Participants may not have said what they actually thought or did in response to the

decision questions, either because they did not want to reveal their actual reasons or 

 because they did not know the actual reasons.

Decision Models May Be Decision-Specific

In selecting which decision model more accurately described the decision-making

 process, many participants said that the appropriate model depended on the type of 

decision. These responses were coded as “both” in Tables A7 and A8. Participants who

said that decision models varied by the type of decision to be made offered three

hypotheses. This section summarizes participants’ theories and cites one or more

 passages from participants in support of each theory.

The first hypothesis is that the IDMR model applies to decisions on so-called "hot button" issues like gun control and abortion in which people are emotionally invested, while the traditional model applies to decisions about less sensitive questions

like banking regulation. As a result, when faced with a deeply-felt issue, an issue that lies

at the core of one’s emotional system, the intuitive model would apply. Feelings on such

issues are strong and immediate and the decision maker is certain of his or her position,

so conscious reasoning about the issue is unnecessary. In contrast, on issues that evoke

no emotional response, the decision maker would have to search for information

consciously in order to decide. Therefore, the traditional model would apply in many

cases. When asked which model better described how most people make political

decisions, a legislator replied as follows.

[T]he typical politician here is kind of riding both [models] but I really think that

it depends on the issue. You know I think some issues [ pause] you know like

[legalized gambling]. You know you may get more of [ pause] you know just a

visceral “No way!” without thinking about what that means, what that doesn’t

mean what the dynamics are, what the evidence is, what the data is [ sic],

economics of it so that you know some people are just going to think you know,

no way, no how gambling you know [ pause] On other issues and even with that

issue depending on the legislator or the person, they may actually go you know

with your first model [traditional] you know I think it just depends now, so what

do we do more of? You know, I think, I think it really just depends on the issue. I

don’t think I can really pin it down. You know sometimes [I] may be under this

model, sometimes I may be under this model. (L18)

Several legislators offered very similar comments contrasting issues that evoked an emotional response with less provocative issues.

[I]t really depends on the issue. I think when we start talking about

education, we start talking about [ pause] you know issues [ pause] issues

that spark emotion [ pause] you know we are going to bring in [ pause] you

know experiences with you know we’ve all had you know children you

know, I don't have any children yet but you know just the [ pause] I guess

the emotional side of it. You know if I was looking at you know a budget

issue, if I was looking at [ pause] something that really doesn’t bring a

[ pause] I don’t want to keep using the word emotional but you know the

intuitive decision you know really I, I guess sparks emotion for the most

 part. You know budget does not, budget is cut and dry you know my work 

life dealing with a client you know is cut and dry you know I don’t think.

I'd say I would be more in line with the traditional reasoning, that two

issues. I would say the second issue we discussed would be you know

 board of education [privatization], I'd probably use more of a traditional

reasoning approach. The first issue [class size] I can assure you I use the

intuitive decision [ pause] approach. (L22)

Other legislators described the opposite of what Legislator 22 noted, with the

privatization issue being decided intuitively and the class size issue being decided consciously. For example,

From my perspective I think I am much more on the traditional reasoning and

decision making model. I [ pause] and I recognize that I certainly have never done

PhD-work but I did post-undergraduate work in public policy from an economic

 perspective and so that’s kind of I mean I, I'm trained in that fashion. Although

even then there is still intuitive elements when you asked the first question [class

size] I would say my response was [ pause] was clearly based upon the first model

[traditional]. I actually thought through my mind about what have I heard about

this, you know quickly I thought about what you know what were your what were

my previous thoughts on it. When you asked the second question [privatization]

which was, “Gee do you want to turn your own school system over to a private

entity” there was more of an intuitive element, I knew instantly that based upon

every, all the inputs I had that [ pause] that no I didn't want to do that [ pause] that

there was some intuition, some greater intuition with number 2 than number 1.

(L41)

Whatever the case may be, these comments suggest that we may make decisions

differently depending on the issue.

The second hypothesis is that the IDMR model applies to questions for which

decision makers have little prior knowledge, while the traditional model applies when

there is an existing decision-specific information base. According to this view, when we

have little or no knowledge on the decision topic, we decide based on our intuitive

response to the question. When we have information on the topic, then we decide in

accordance with the traditional model by thinking about the information we have

available. As one legislator described his theory,

You know it’s probably, a lot probably depends upon what the decision is, what

kind of background you have, what kind of information, what kind of resources

you have to go into [ pause] a reasoned process rather than an intuitive one. You

know if it is something that you don’t have a lot of experience in or a lot of 

knowledge about it, I think you rely on your intuition whereas if it is something

that you really had a lot of experience and knowledge and research and reading or 

dealing with somebody who was knowledgeable and you’ve taken their thinking,

then I think you probably move into this more reasoned model. (L17)

Finally, the third hypothesis is the opposite of the second. That is, if one has

information about a topic, the intuitive model would apply, but if the decision topic were novel, then the decision maker would decide in accordance with the traditional model.

Accordingly, if the decision maker had not considered the question previously, the

traditional model would apply because the decision maker would have to gather 

information on the question before making a decision. If, however, the decision maker 

already had information on the topic, a decision could be based on an intuitive response

that was the product of this information. This hypothesis is based on the operation of an overall evaluative tally of the kind described earlier.

 based on prior experience, debate you have heard in the past, things you have

tried that haven't worked out and so I can't say its purely intuitive because you

have that conscious reasoning process in the past but you are basing a lot more on

intuition because you just understand this isn't going work. (L35)

As a variation of this third hypothesis, certain legislators distinguished between

legislation that they had considered in one or more prior legislative sessions and novel

legislation. If they had considered the decision before, they said they would rely on their

intuitive response, which would be based on their prior conscious consideration of the

issue. On the other hand, if the legislation was novel, then they would look for 

information on the issue and base their decision on that analysis. As explained in the next

discussion section, it can be argued that this sort of independent analysis is unlikely even

for novel issues. For instance, in the present study most participants made decisions on

novel topics without any consideration of decision-specific information.

Although participants’ hypotheses do not take into account the evidence in

Chapter II that people invariably have affective responses to all environmental stimuli so

that the intuitive processes would operate in all decisions, the hypotheses offer insight

into the design of future decision models, as discussed in Chapter VII.

Discussion

Participants’ model choices and their associated comments are important for at

least four reasons. First, the fact that most participants chose the intuitive model to

describe some, if not all, of their policy decisions suggests that the traditional model is

not accurate in all cases. This evidence is in line with the central hypothesis of this study

that complex policy decisions are not the products of conscious reasoning alone.

Participants’ assessments of their own decision making indicates that this line of research

is a promising one.

Second, participants’ comments about the models suggest that how we make

decisions could depend on the decisions themselves. Certain decisions might be the

 product of conscious deliberation while others might be based on preconscious

 preferences. Participants’ analyses of their own decision-making processes could help

create better models of decision making about complex questions. In several cases

 participants mentioned that they had never really thought about their own decision-

making process, and that thinking about the two model diagrams was a useful exercise

for them. Legislator 27 made this point in response to the final interview question. Each

interview ended by asking participants to rate their interview experience on a scale from 0 to 4, with 4 being the highest. This legislator responded positively to the

interview experience, as follows: “Oh I, a 4, it was actually, I thought you were going to

 be just kind of a pain in the ass. And actually, I got a little bit out of this thing too [ pause]

I never really thought that that was what I do until I saw the chart [the diagram of the

intuitive model]” (L27).

Third, how a person makes policy decisions depends upon the person. This is to

say that there are self processes that bear upon the decision making process and that the

major omission of the traditional model of decision making may not be that it neglects

 preconscious influences on decisions but that it ignores the self in analyzing decision

making. This is a large point that is only addressed briefly here and in the next chapter,

 but it emphasizes that any investigation of decision making and reasoning must take into

account who the decision makers are. People are the products of their biological


characteristics, their personal and professional experiences, including education, and the

 preconscious and conscious dispositions, values, and principles that influence or are

influenced by these characteristics and experiences. So, the participants in this study

were not merely or primarily legislators or doctoral students; they were much more
dynamic and complex. Accordingly, participants’ decisions and interview responses must
be considered in the larger context of what made participants who they are, and how their

characteristics and experiences may have shaped their decisions and responses. Although

it is beyond the scope of this study, participants’ self-concept or self-efficacy, among

other self processes, likely played an important role in their decision making. It may be

that a major contribution of this study is to bring the self into the study of decision

making.

Finally, in response to what some participants said about deciding in the way

depicted by the traditional model in Figure 1, there is reason to believe that it will be only

in rare cases that decision makers can and do commit the time and resources necessary to

make a deliberate decision on a policy question. Several legislators said as much, noting

that legislators faced too many decisions during the course of a short legislative session

to make decisions in the manner suggested by the traditional model. Based on the short

decision latencies and analysis times, and the decision-making processes observed when

interviewing legislators, it stands to reason that legislators are not likely to independently

and consciously investigate all proposed legislation before making a decision. During

committee hearings they may encounter a great deal of decision-specific information on

certain proposed legislation, but time constraints will likely limit how much independent

research they can do on each of the several thousand pieces of legislation submitted in a


two or three month session. As a result, those legislators who suggested that they made

decisions in accordance with the traditional model may not realize that they too are

subject to the influence of preconscious processes.

And this lack of awareness of preconscious processes is the harm in promulgating

the traditional, purely conscious model of decision making. If instead we acknowledged

that policy decisions, even among elected officials and doctoral students, are not in all

instances the product of a conscious examination of relevant decision-specific

information, we as decision makers might be more careful and systematic when making

important decisions.


Chapter VII

SUMMARY, CONCLUSIONS AND IMPLICATIONS

Summary and Conclusions

Based on a review of research in social psychology and neuroscience, there is

evidence that decision making about complex policy questions might be influenced by

 preconscious processes (Bargh et al., 1996; Damasio, 1994; Epstein, 1990; Haidt, 2001;

Nisbett & Wilson, 1977; Zajonc, 1980). Although the research did not directly address

decision making, this work implied that the traditional, purely conscious model of 

 political decision making was incomplete. The present study was designed to investigate

directly whether certain findings from social psychology and neuroscience research could

 be extended to decision making, and whether there was evidence that decisions about

complex policy questions were influenced by preconscious processes.

The first research question concerned whether preconscious processes influenced

complex decisions and the second and third research questions concerned how decisions

and decision makers differed. On the first question, there was evidence that participants’

decisions about two legislative proposals to improve academic achievement were

influenced by one or more of the following preconscious processes: a visceral response to

the proposal; political or relevance cues in the decision questions that activated existing

 preferences or principles; prior schema on how this type of issue was to be handled; a

 judgment heuristic based on recent and accessible relevant experience; and an overall

evaluative tally based on prior consideration of the specific or related legislative

 proposals. Consequently, the findings of this study challenge the descriptive validity of 

traditional, purely conscious models of reasoning and decision making.


This conclusion about the descriptive validity of the traditional model is based on

three sources of evidence of preconscious processes. The first is the data on legislators’

decision latencies and analysis times for both decisions. Legislators made their decisions

and offered reasons for their decisions quickly (Table 2). Although the difference was not

statistically significant, legislators made decisions more quickly than they offered reasons

to support the decision. If the traditional model were accurate, the results should have
shown decision latencies that were longer than analysis times, as was the case with
adjusted decision latency and adjusted analysis time for graduate students on the
privatization decision. This would be so because, under the traditional model, reasoning
precedes decision making in all cases, so decision latency would encompass analysis time
and, therefore, be longer than analysis time in all cases. However, the unadjusted values
for graduate students are not evidence that

graduate students decided in a manner inconsistent with the traditional model.

The second source of evidence against the traditional model was the relation

 between participants’ certainty and the amount and quality of information they offered in

support of their decisions. Legislators, on both decisions, and graduate students, on the

class size decision, reported a high level of certainty that their policy decision was

correct, even though they offered few justifications to support these decisions and in

many cases offered only personal evidence in support. Additionally, on the privatization

decision legislators reported a high level of certainty, almost identical to their certainty on

the class size decision, even though they reported a significantly lower amount of 

knowledge on the privatization decision. This disconnect suggested that legislators’

certainty judgments were not in all cases based on a conscious evaluation of their state of 


knowledge about the decision topics, but rather on an affective signal of how much they

felt they knew. By comparison, for the privatization decision graduate students reported a

significantly lower level of certainty in their decision along with a significantly lower 

appraisal of their own knowledge on the topic. The data on graduate students’ certainty

and self-assessed knowledge for the privatization decision were not inconsistent with the

traditional model.

A third source of evidence against the traditional model was participants’

selection of decision models and their comments about their own decision making. The

vast majority of participants selected the intuitive decision making and reasoning

(IDMR) model over the traditional model to describe how most people and how they

themselves made political decisions. Also, what participants said about their own

decision-making processes, as described in Chapter VI, offered strong support for the

conclusion that the traditional model does not describe how people make political

decisions. For example, many legislators made clear that deciding in accordance with the

traditional model would be impossible given the time constraints of the legislative

session.

On the second and third questions, there was evidence that participants’ decision

making about the two decision questions differed and that, as a group, legislators and

doctoral students made their decisions differently. These results, including the evidence

cited in the preceding paragraphs, suggest that decision making may be a decision-

specific process, shaped by whether or not the proposal evokes a visceral response and by

what one’s professional experience and personal principles and goals suggest is the

superior decision. Similarly, legislators and students appeared to decide differently not


 because one group had more decision-specific information about how class size limits or 

 privatization might affect academic achievement, but rather because legislators thought

about things doctoral students did not or because graduate students were less certain

about decisions for which they reported knowing less. For instance, legislators thought

about how supporting privatization might harm their next election campaign. No doctoral

student was concerned with reelection.

The differences between decision questions and between legislators and graduate

students led to the first of two general conclusions about the data. While these differences

were not as obvious as hypothesized, they focused attention on the range of individual

differences among decision makers. As a result, it became apparent how decision-

specific and individual-specific the process of making a policy decision can be. It was

difficult to discern patterns in the way participants made the class size decision or the

 privatization decision, or to distinguish legislators’ and graduate students’ responses to

the interview questions. There were important differences, as discussed in Chapter V, but

the most important conclusion may be that the process of making a policy decision is

 based on the experiences, information, values, principles, and goals that distinguish

 people, so the process ends up being idiosyncratic. Notwithstanding the individual-

specific characteristics of participants’ decision making, they all shared one feature: no

one made either decision in a systematic way.

Participants did not make their decisions in a step-by-step manner by considering

the stated goal of improving academic achievement, weighing how well opposition to

and support for the legislative proposal would achieve that goal, considering the costs

and other consequences of each course of action, and only then reporting a decision. No


 participant wrote anything down while making their decisions, and no one indicated that

they would need to do so to make a decision. Based on these results, a second general

conclusion can be offered. When faced with a complex question, to make a sound
decision that is consistent with one’s expectations and interests, a decision maker must
proceed through certain well-defined steps, asking the sorts of questions set forth on

the decision map in Appendix E. Participants were rarely concerned about whether their 

answers were based on well-supported reasons. Also, although this study did not explore

this issue, certain legislators’ and graduate students’ responses suggested that there was

little room for persuasion, and that better evidence might not be sufficient to move

 participants to revise their decisions.

Finally, with regard to traditional models of choice, there was no evidence that

any of the participants in the study were calculating and maximizing subjective expected

utility in accordance with the traditional model. To maximize expected utility when

making a decision, a decision maker has to consider all reasonable decision alternatives,

evaluate the subjective utility of each alternative, calculate the probability of each

alternative occurring, and then select the alternative with the highest subjective expected

utility (i.e., subjective utility multiplied by the probability or expectation of occurrence

equals subjective expected utility). Given that the sample was composed of highly

educated professionals, the lack of any evidence that anyone was maximizing utility is a

blow to the dominant decision model. This finding recalls Slovic’s (1991) conclusion that
“The normative assumption that individuals should maximize some quantity may be
wrong. Perhaps . . . there exists nothing to be maximized” (p. 500).
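The subjective-expected-utility rule summarized in the preceding paragraph can be stated compactly. The worked example that follows is offered only as an editorial illustration; the two alternatives, the utilities, and the probabilities in it are hypothetical and do not come from any participant’s responses:

\[
\mathrm{SEU}(a) = \sum_{i} p_{i}\, u_{i}(a)
\]

\[
\mathrm{SEU}(\text{support}) = (0.4)(80) + (0.6)(20) = 44, \qquad \mathrm{SEU}(\text{oppose}) = (0.7)(50) + (0.3)(10) = 38
\]

A decision maker who held these hypothetical values would, under the traditional model, be expected to choose the alternative with the larger total (here, support). Nothing in the interview data suggested that any participant performed, or attempted to perform, a calculation of this kind.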


Limitations

This study was designed to address certain gaps in the decision literature

concerning how complex decisions are made, with specific emphasis on the absence of 

 preconscious processes. Of course, given the complexity of decision-making processes

and the inadequacy of any effort to study them, this study is not without limitations. The

 principal limitation of this study is that it sought to examine hidden, preconscious

 processes that are difficult to measure. For instance, this study investigated preconscious

 processes based, in part, on participants’ conscious responses to interview questions. If 

 participants were sometimes unaware of the preconscious processes that influenced their 

decision making, as hypothesized, then participants would not be able to report the

influence of such processes in all instances. Asking participants to report processes of 

which they may not be aware stands as a significant limitation.

Another important limitation on this study was the choice of a legislative sample

and the constraints imposed by this sample. Interviewing legislators in a study of 

 preconscious processes imposes limits on how the data could be collected, which

ultimately limits the inferences that can be drawn from the data collected. For example,

cognitive task analyses are not ordinarily done in the way in which participants were

interviewed in this study, nor are response times measured using interview recordings

and a stopwatch, since these methods introduce human error. As a result, there may be a

mismatch between my methodology and the inquiry into the role of preconscious mental

operations, because of the constraints associated with including legislative participants.

While self processes likely played a role in participants’ decision making and

participants’ backgrounds and experiences likely influenced their decisions and


reasoning, this study did not examine how characteristics like age, political affiliation,

gender, professional experience, committee membership, or legislative experience may

have influenced participants’ policy decisions. A dedicated examination of participants’

experience and education as the “presage” that shaped their decision making would have

enhanced the present inquiry. Without such an examination, which was limited in this

study to a comparison of how legislators and doctoral students made decisions, this

research reveals little about how an individual’s unique experiences and education shape

that individual’s decision making about complex policy questions.

Also, it is possible that the decision questions or the interview procedures forced

 participants to make a decision they might otherwise not make for lack of information, or 

to reach a decision more quickly than they otherwise might (a demand effect), causing

their decisions as part of the study to appear to be the result of preconscious processes

when their decisions in different circumstances conform to traditional conscious-

reasoning-only decision models. For instance, participants in this study were not given

any information on the decision topics to assist their decision making and the decision

questions did not offer participants the alternative of not making a decision. This concern

is mitigated, however, by the fact that the decision questions and the decision settings in

the present study were similar to actual decisions participants make and the settings in

which they actually make them. Furthermore, legislators’ responses in particular gave no

indication that they treated the legislative proposals in this study any differently than

actual proposals they encountered as legislators.

There was also likely a selection bias in who participated in the study. This was

not a random study of adults, as letters or emails were sent to request the participation of 


specific individuals. Only those who agreed to participate could be

interviewed. If those who agreed to participate were different in some material way in

their decision-making and reasoning processes from those who did not, the evidence

collected would be misleading to some extent. A related concern is the issue of social

desirability. Participants were asked to explain and support their decisions on policy

questions, and participants may have felt some pressure from within or assumed some

 pressure from without to offer reasonable and sound explanations for their decisions,

whether or not those explanations were the ones that led to their policy decisions.

This investigation was designed to measure participants’ prior information

(referred to alternatively as decision-specific information, justifications, evidence or 

rationale) for each decision topic. There were, however, limitations associated with

measuring prior information. First, although the information was referred to as “decision-

specific,” the data collected did not make it possible to distinguish in all cases whether 

the decision maker had relied on the evidence offered in support of a policy decision

 before making the decision, or whether that evidence was generated after the decision

was made.

A second limitation of measuring prior information is that the interview questions

asked participants for their evidence or reasons for a decision, but the questions were not

drafted to explore in a probing and persistent way the limits of participants’ existing

information on these decision topics. Without pressing respondents for more information, it

was difficult to determine whether participants knew more than they said; without

challenging their evidence it was difficult to determine if participants knew less than they

said. The interview protocol may not have collected evidence in all cases that would


make it possible to distinguish between those participants who had little knowledge about

the decision topic and those who had considerable knowledge about the topic but who,

 because of their response style or personality, did not volunteer all that they knew about

the topic when asked to explain their decision. Since participants’ evidence and reasoning

were not challenged, it was not possible to determine the true extent of participants’

knowledge about the decision topic. This study simply had to rely on what they said.

Implications for Practice and Research

Implications for Practice

The practical implications of this study are discussed for three groups: decision

makers, educators and educational researchers, and students. Based on the findings of 

this study, any decision maker who is making a decision of consequence for herself or for 

others must keep in mind that a decision that feels certain was not necessarily the product

of sound reasoning. Unless decision makers systematically scrutinize their important

decisions, the data show that even highly educated legislators and doctoral students will

make complex policy decisions and be certain about those decisions with scant evidence

or deliberation.

Left unexamined, quick decisions on complex policy questions may lead to

consequences that are contrary to the decision maker’s expectations and best interests.

The important point is that unless the decision maker exerts some conscious
control to make the process systematic, those things that become active when she is
asked to make a decision (e.g., a visceral response, the most recent or accessible
information in memory, the first thing she hears on a subject) will govern the decision,


which may not produce a rational outcome, or even the decision maker’s desired

outcome.

And based on the overall evaluative tally, it is possible that a poorly made

decision could become entrenched as an unexamined truism that dominates future

decisions. The only way for decision makers to ensure that their decisions are as accurate

and reasonable as possible under the circumstances is to go through a

conscious process of considering goals, alternatives, and consequences, as well as trying

to identify preconscious biases or tendencies. As an example of systematic decision

making, a “decision map” (Appendix E) was prepared to illustrate what decision makers

might consider to improve their decisions, time and other resources permitting.

For educators and educational researchers this study has at least two implications.

The first is that educators and educational researchers are like other decision makers, so

they must be aware that preconscious processes influence their own decisions and

reasoning. Second, educators and educational researchers are responsible for educating

students in primary, secondary and higher education. This study reveals that highly

educated adults relied on preconscious processes to make difficult decisions, so formal

education as it is presently constituted does not seem to teach students at any academic

level to make important decisions in a deliberate and systematic way. In other words,

formal education does not teach students to make important decisions well.

Before educators and researchers can determine how to teach students to make

important decisions well, studies like this one must clarify how adults make important

decisions, to determine what adult decision makers’ practices are and what the defects in

these practices may be. Once the processes and defects are identified, educators and


researchers can set about the process of designing instruction to improve how students

make important decisions. While this study suggests that decision makers do not make

systematic decisions on complex policy questions, there is no reason to believe that we

cannot make systematic decisions.

Implications for Research

This study has important implications for research in decision making, choice,

 persuasion, political science, and education. To my knowledge, this is the first study of 

 political decision making and reasoning that interviewed legislators about how they make

 policy decisions. This is also the first study of decision making to propose an alternative

decision model and to proceed from the hypothesis that complex decisions are subject to

 preconscious influences, so that policy decisions may not in all cases be the product of 

conscious reasoning about abstract information. As such, this study is only a first step.

The study can be improved and it can be extended to other research areas.

The principal methodological challenges for future research are measuring

preconscious processes and their influence on complex decisions (given the limitations of
self-report data), and doing so without allowing those same processes to color the
collection and interpretation of the data. My experience with this

study suggests that the complexity of the subject and the influence of preconscious

 processes on any one researcher’s analyses almost demand a team approach.

A single researcher’s ability to understand data is limited by his unique

experiences, theories and principles, and his interpretations of and conclusions about the

data are more vulnerable to the influence of preconscious processes than the work of a

group of researchers would be. This is the case because having several people collect,


analyze and interpret data would reduce the likelihood that one person’s biases and

intuitions determined what the data meant. Since one of the findings of this study is that

participants made decisions about identical questions in different ways, based on different

values, beliefs, information and experience, there is reason to believe that a single

researcher’s conclusions about decision making data might be similarly idiosyncratic. We

now turn to the question of future research on decision making, choice and persuasion.

How decisions are made, how information, experience, and beliefs, among other 

things, interact to make and revise decisions, and how much of the process is available to

conscious control and improvement are three important questions that merit continued

study. Once the decision-making process on difficult questions is described more

accurately, the most important question for educators becomes whether decision makers

can improve their decisions by being more systematic, that is, by following a limited

number of decision guidelines designed to limit the influence of unexamined values,

beliefs, or other factors. From the perspectives of political science, choice, and persuasion

research, it is critical that researchers keep in mind the possible influence of preconscious

 processes on the choices people make and the circumstances under which they may be

willing to change their minds. Based on the literature reviewed in Chapter II and

legislators’ comments about their high levels of certainty, political scientists and choice

researchers should consider how decisions are formed, how information processing is

influenced by processes that operate beneath conscious awareness, and how limited

symbolic information alone (i.e., written material) might be in educating and persuading

 people.


Appendix A

Instructions, Decision Questions and Interview protocol

INSTRUCTIONS

Opening instructions – Before the interview begins

Good (morning or afternoon). I appreciate your agreeing to take the time to speak with me. As you know, you are participating in a study of political reasoning. The

 procedure is as follows: I will ask you for your decision on one topic and ask you about

10 questions about your decision. Then I will ask you for your decisions on a second topic and ask you the same 10 or so questions following that decision. Please answer

these questions as well as you can. After this part of the interview is complete, I will ask 

for your feedback.

This process will take a total of about 45 to 60 minutes and it will be tape

recorded. I must proceed through the interview by adhering to the questions in front of 

me, and I cannot divulge any details about the content of the questions before I ask them, but I would be happy to answer any questions about the study after the interview is

complete.

As you know from the informed consent form, your responses are confidential.

Do you have any questions or concerns before we begin?

(If no) Let us begin.

 Debriefing instructions - After the interview is over 

Your interview is now complete. Thank you again for your participation. To

preserve the integrity of this research, I ask that you not speak with anyone about the questions, answers or format of the interview you just completed until I finish

interviewing other participants. If you discuss the interview with any of them, it will

undermine the study.

Do you have any questions?


DECISION QUESTIONS

Would you support or oppose legislation to limit class size to 25 students in all [name of 

state] public schools as a means to improve academic achievement?

Would you support or oppose legislation to transfer management and control of public

schools in your county or legislative district from the local school board to a private

company as a means to improve academic achievement?

INTERVIEW PROTOCOL

Part 1.

1. Why would you [support/oppose] such legislation?

Follow-up probe to elicit more information:

a. What is your decision [in support/in opposition] based on, e.g., specific

studies, committee reports, personal experiences?

2. Suppose now that one or more colleagues disagreed with your decision regarding

this legislation. What evidence might they give or what arguments might they make in [opposing/supporting] the legislation?

3. How sure are you that your decision regarding the legislation is correct? Not certain, Somewhat uncertain, Somewhat certain, or Certain?

4. Do you think [decision topic] policy experts know for sure what the correct decision about the legislation is?

a. (If no) Would it be possible for experts to find out for sure if they studied

this problem long and carefully enough?

i. (If no) Why do you say this?

5. Have you ever considered or discussed this proposal with anyone before today?

a. (If no) Does this topic remind you of anything you have thought about or discussed previously?

 b. (If yes) How knowledgeable would you say you are about this [decision

topic] proposal, on a scale from 0 to 4, with 0 representing no prior knowledge and 4 representing expertise?


6. When I first asked you this question about this [decision topic] legislation, did it

 bring to mind any positive or negative feelings, ideas or images?

a. (If yes) What were those feelings, ideas or images? Please be as specific as you can.

7. Do you think this [decision topic] legislation is better characterized as a liberal or 

a conservative position?

8. Looking back, how quickly did you make your decision? Instantaneously,

Quickly, Deliberately, or Slowly?

Part 2.

1. What is the most important issue or legislation that must be addressed by the

[legislature] to improve academic achievement in [name of state] public schools?

Participants were shown the two model diagrams in Figures 1 and 2 while the differences between the two models were described. They were then asked the following

questions:

2. Which model more accurately describes how people make decisions?

3. Which model best explains your decisions earlier in this interview?

a. (If IDMR Model) Which of the influences on intuitive decisions described

in the model do you consider to have had the greatest influence on your responses to each question?

4. Overall, how would you rate your experience as a participant in this study, on a

scale from 0 to 4, with 4 being the highest score.


Appendix B

Kuhn (1991) interview protocol (illustrated for crime topic)

Causal theory and justification

1. What causes prisoners to return to crime after they’re released?

a. (Probe, when subject completes initial response) Anything else?

2. (If multiple causes mentioned) Which of these would you say is the major cause

of prisoners returning to crime?

3. How do you know that this is the cause?

a. (Probe, if necessary) Just to be sure I understand, can you explain exactly

how this shows that this is the cause?

4. If you were trying to convince someone else that your view [that this is the cause] is right, what evidence would you give to try to show this?

a. (Probe, if necessary) Can you be very specific, and tell me some particular facts you could mention to try to convince the person?

5. Is there anything further you could say to help show that what you’ve said is correct?

6. Is there anything someone could say or do to prove that this is what causes prisoners to return to crime?

7. Can you remember when you began to hold this view?

a. (If no) Have you believed it for as long as you can remember?

b. (If yes) Can you remember what it was that led you to believe that this is the cause?

Contradictory positions

8. Suppose now that someone disagreed with your view that this is the cause. What

might they say to show that you were wrong?

9. What evidence might this person give to try to show that you were wrong?


a. (Probe, if necessary) Just to be sure I understand, can you explain exactly

how this would show that you were wrong?

10. (If not already indicated) Is there any fact or evidence which, if it were true, would show your view to be wrong?

11. Could someone prove that you were wrong?

12. (Omit if alternative theory already generated) A person like we’ve been talking about whose view is very different from yours – what might they say is the major

cause?

13. (Include if no alternative theory generated) Suppose that someone disagreed with

you and said that___________ was the cause. What could you say to show that

this other person was wrong?

a. (Probe, if necessary) Just to be sure I understand, can you explain exactly

how this would show the person was wrong?

14. Would you be able to prove this person wrong?

a. (If not already indicated) What could you say to show that your own view is the correct one?

Instrumental reasoning

15. Is there any important thing which, if it could be done, would lessen prisoners’

returning to crime?

16. Why would this lessen it?

Epistemological reasoning

17. How sure are you about what causes prisoners to return to crime?

18. Do experts know for sure what causes prisoners to return to crime?

a. (If no) Would it be possible for experts to find out for sure if they studied this problem long and carefully enough?

19. How sure are you of your view, compared to an expert?

20. Is more than one point of view possible regarding the question of what causes

 prisoners to return to crime?


21. (If yes) Could more than one point of view be right?

22. How much would you say you know about this topic, compared to the average

 person?

23. How important is this topic to society as a whole?

24. How important is this topic to you personally?


Appendix C

Variable Selection and Revisions to Variables and Coding Schemes

As with the interview protocol, the starting point in determining which variables

to measure was Kuhn’s (1991) study of reasoning about causal theories. Based on Kuhn’s

interview protocol and the categories of reasoning she and her team developed to

measure the data they collected, the present study measured evidence, counterarguments,

certainty, epistemological understanding, and self-assessed knowledge. Since the

 possibility of preconscious influences on decision making was also to be explored,

response times (decision latency, analysis time, counterargument latency, and partisan

latency), affect, reported speed to decision, argument repertoire and choice of decision

model were added to the list of variables to be measured. After reviewing interview

transcripts from legislators, it became clear that certain variables and coding schemes

 based on Kuhn’s work needed to be revised or replaced. Specifically, the variables

relating to evidence and counterarguments had to be revised. How these new variables

and coding schemes were developed is detailed in this Appendix.

For each interview recording, a written transcript was prepared. One transcriber 

completed all the legislative interviews, while a second completed all the student

interviews. The transcripts were read while listening to the interview recordings to

confirm the accuracy of the transcripts.

After preparing a corrected transcript for each legislator, a document that served

as a template to organize the legislators’ responses into a format that facilitated coding

was prepared. Using this template, the legislator’s responses to each interview question

were reorganized so that the template contained the segment of transcript that


corresponded to each of the variables to be measured. Because each legislator’s transcript
had been reviewed at least three times by that point, additional comments about the
responses were added as they were organized into segments for each variable. This first round of open

coding involved observations about patterns in a legislator’s responses, how the legislator 

compared with other legislators, the types of reasons and sources of evidence that

legislators offered, how a legislator’s personal and professional experiences shaped their 

responses and so on.

After completing a template for every legislator, three legislators’ transcripts were

selected randomly to assess the suitability of the coding scheme developed based in part

on Kuhn (1991). A review of these three transcripts revealed that the coding schemes for 

evidence, counterarguments, prior knowledge and reason content and quality were not

suitable. As the result of the iterative process of reading the three randomly selected

transcripts and revising the variables in question, a more suitable scheme was developed

to measure what legislators knew about the decision topics and how they justified their 

decisions. Additionally, a method was settled upon for measuring analysis time, while

counterargument latency and partisan latency were added to the list of variables to be

measured. Once these changes had been made, the revised variable list and two

legislators’ transcripts in template format were sent to the chairperson and another 

member of the author’s dissertation committee. After reviewing the revised variables in

connection with the transcripts, this committee member suggested minor changes to the

revised variables. These recommendations were incorporated.

As a result of this process of revising variables, “evidence” was revised as “citing

evidence.” “Prior knowledge” and “reason content and quality” were revised as


“justificatory rationale.” The categories within “counterarguments” were also revised.

The method for measuring analysis time was determined and two variables

(“counterargument latency” and “partisan latency”) were added to compare with

“decision latency.” Finally, “argument structure” was removed.

Decisions and Reasons

In analyzing participants’ responses to Question 1 and the follow-up probe, the

focus was on whether policy decisions were based on external sources of information or 

on personal experience or beliefs. In terms of soundness and accuracy, empirical research

and committee reports, two examples of external sources of information, are superior to

 personal experience, principles or beliefs that are offered without any mention of an

extrinsic source of support. In other words, an educational policy decision that is justified

on the basis of published research is more likely to be sound than a decision that is

 justified on the basis of what the decision maker believes to be true without any reference

to the source of such belief.

Based on this assumption, the evidence categories were changed to make the

distinction between external and personal evidence clear. Similarly, prior knowledge

should be rated according to the source and quality of the justifications participants offer 

to support their decisions. Thus categories of justificatory rationale were drafted to

classify the source and quality of participants’ justifications along a continuum of 

descending quality as follows: controlling law, professional publication, general

 publication, data, professional experience, personal experience, and vague. Once this

scheme was created to rate the content and quality of the support offered for participants’

 justifications, a separate variable for reason content and quality was no longer necessary.


Table A2

 Number and Percentage of Legislators Citing External Evidence, Personal Evidence and

 Nonevidence in Response to Interview Question 1 and Subsequent Probe for Detailed

Information

Category         Class Size                     Privatize
                 Question 1       Probe         Question 1       Probe
                 f      %         f      %      f      %         f      %
External         4      9.8       12     29.3   0      0         8      19.5
Personal         24     58.5      7      17.1   29     70.7      15     36.6
Both             12     29.3      6      14.6   10     24.4      3      7.3
Non              1      2.4       5      12.2   2      4.9       4      9.8
Not applicable   0      0         11     26.8   0      0         11     26.8
Total            41     100       41     100    41     100       41     100


Table A3

 Number and Percentage of Graduate Students Citing External Evidence, Personal

Evidence and Nonevidence in Response to Interview Question 1 and Subsequent Probe

for Detailed Information

Evidence Category   Class Size                     Privatize
                    Question 1       Probe         Question 1       Probe
                    f      %         f      %      f      %         f      %
External            1      5.6       2      11.1   2      11.1      6      33.3
Personal            13     72.2      8      44.4   15     83.3      5      27.8
Both                2      11.1      2      11.1   1      5.6       1      5.6
Non                 0      0         2      11.1   0      0         2      11.1
Not applicable      2      11.1      4      22.2   0      0         4      22.2
Total               18     100       18     100    18     100       18     100

 Note. In tables A2 and A3, the term "probe" refers to the follow-up probe that asked

 participants for the specific evidence on which their decision was based, whether specific

studies, committee reports, or personal experience. The evidence participants reported is

 presented separately for Question 1 and for the follow-up probe, because each question in

the interview protocol served a different purpose. Question 1 was drafted to avoid

 priming any specific sources of evidence to measure what the participant herself reported

without prompting. Question 1, therefore, was more likely to measure the evidence that

actually influenced the reported decision. Not all participants were asked to answer the


follow-up probe. For instance, 11 of the 41 legislators are listed as “not applicable.” In

those interviews where a participant offered specific grounds for their decision in

response to Question 1, the follow-up probe became unnecessary.


Table A4

Number and Percentage of Legislators and Graduate Students Offering Specific Types of
External and Personal Justificatory Rationale in Support of their Policy Decisions

Citing Evidence:             Legislators                     Graduate Students
Justificatory Rationale      Class Size      Privatize       Class Size      Privatize
                             f      %        f      %        f      %        f      %
External Evidence
  Controlling Law            1      2.4      1      2.4      0      0        0      0
  Professional Publ.         3      7.3      3      7.3      3      16.6     0      0
  General Publication        2      4.8      3      7.3      2      11.1     3      16.6
  Data                       25     60.9     10     24.3     1      5.5      4      22.2
Personal Evidence
  Professional Exp.          11     26.8     8      19.5     7      38.8     5      27.7
  Personal Exp.              36     87.8     39     95.1     15     83.3     16     88.8
  Only Personal Exp.         11     26.8     18     43.9     7      38.8     8      44.4
Nonevidence
  Vague                      0      0        1      2.4      0      0        0      0

 Note. If a participant cited external evidence they encountered in the course of their work 

as a legislator or graduate student but did not cite the specific source of that information,

the professional experience was coded as external evidence. The frequencies do not add

up to 41 for legislators and 18 for graduate students because in many cases participants

reported more than one type of rationale in support of their decision, so the same

individual could be represented in multiple categories on Table A4. For example,

Graduate Student 9 decided to oppose the proposal to privatize public schools and


offered external evidence in the form of lessons learned from professional experience as

a teacher and graduate student and personal evidence in the form of personal beliefs

about control of public schools. This student is represented twice in Table A4, once under

 professional experience and again under personal experience.

Table A5

 Number and Percentage of Legislators and Graduate Students Generating

Counterarguments

Category        Legislators                     Graduate Students
                Class Size      Privatize       Class Size      Privatize
                f      %        f      %        f      %        f      %
Specific        9      21.9     5      12.1     9      50.0     2      11.1
Relevant        31     75.6     27     65.8     13     72.2     14     77.7
Unsuccessful    1      2.4      9      21.9     3      16.6     4      22.2
Nonattempt      3      7.3      3      7.3      1      5.5      0      0


Table A7

 Number and Percentage of Legislators and Graduate Students Selecting Traditional

Model, IDMR Model or Both to Describe How Most People Make Political Decisions

Model          Legislators            Graduate Students
               f       %              f       %
Traditional    1       2.4            0       0
IDMR           24      58.5           16      88.9
Both           6       14.6           2       11.1
Unclear        8       19.6           0       0
NA             2       7.3            0       0
Total          41      100            18      100

Table A8

 Number and Percentage of Legislators and Graduate Students Selecting Traditional

Model, IDMR Model or Both to Describe How They Themselves Make Political

Decisions

Model          Legislators            Graduate Students
               f       %              f       %
Traditional    5       12.2           1       5.6
IDMR           16      39.0           11      61.1
Both           11      26.8           6       33.3
Unclear        9       22.0           0       0
NA             0       0              0       0
Total          41      100            18      100


Table A9

Individual Legislator’s and Graduate Student’s Data for Class Size and Privatization Decisions

ID

Decision Decision

Latency(seconds)

Analysis

Time(seconds)

Evidence and Word

Count in Response toQuestion 1

Argument

Repertoire(number)

Certainty

(0 to 3scale)

Self-

AssessedKnowledge

(0 to 4 scale)

Affect Reported

Speed toDecision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

Legislators

1 Op Op 4 3 0 0 Pers

155

Pers

241

2 8 3 3 0 0 No No 3 2

2 Su Op 0st 1 0st 3 Non

173

Pers,Ext

374

4 na 2 0 0 0 Yes Yes 1 0

3 Op Op 9 0 8 25 Pers

394

Pers,Ext

99

2 4 2 2 na 2 Yes Yes 3 2

4 Op Op 4st 0 4st 0 Pers

120

Pers

60

2 4 1 3 0 0 Yes Yes 3 2

5 unc Op 0st 0 0st 0 Pers

82

Pers,Ext

118

1 4 3 3 0 0 nr nr 1 0

6 Su Op 0 1 0 3 Pers

45

Pers

103

2 3 2 2 3 2 Yes Yes 1 1


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

7 Su Op 1 0 0 1 Ext

50

Pers

239

2 3 3 3 3 2 No Yes 3 3

8 Op Op 0 8 0 1 Pers

232

Pers

41

2 3 3 3 0 0 na nr 0 3

9 Op Op 0 8 1 14 Pers

97

 Non

55

0 na 3 0 2.5 0 Yes Yes 1 3

10 und Op na 6st 1 6st Pers

138

Pers,Ext

150

2 4 0 0 0 0 Yes No 2 1

11 Op Op 0 8 5 9 Pers

111

Pers

206

3 2 2 0 0 0 Yes Yes 3 2

12 Su Op 0 0 0 0 Ext

30

Pers

24

2 3 3 3 4 2 Yes Yes na na

13 Op Op 4 1 0 3 Pers

263

Pers

313

4 2 3 2 1 na No Yes na 1

14 Su Op 0 2 0 2 Pers

160

Pers

160

3 5 3 3 4 0 Yes nr na 2


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

15 Su Op 0st 0 0st 4 Pers

66

Pers

45

3 2 3 3 0 0 Yes No 3 3

16 Su Op 1st 0 1st 0 Pers

159

Pers

20

4 3 3 3 0 0 Yes Yes 1 1

17 Op Op 3st 0 3st 1 Pers

82

Pers

123

2 5 0 2 0 0 Yes Yes 2 3

18 Su Op 4 0 9 0 Pers,Ext

310

Pers

183

4 5 3 3 0 Yes Yes 3 3

19 Su Op 1 6st 4 6st Pers,Ext

55

Pers

158

3 4 3 2 4 3 Yes Yes 3 2

20 Su unc 0 3st 9 3st Pers,Ext

45

Pers

192

3 2 3 3 na na nr No 2 1

21 Su Op 0 7st 0 7st Pers

47

 Non

31

4 2 3 1.5 0 0 Yes Yes 3 0

22 Su Op 6st 5 6st na Pers

116

Pers

321

3 3 3 2 3 0 Yes Yes 2 2


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

23 Su Su na 0 na 6 Pers,Ext

218

Pers

416

7 6 3 3 2.5 3.5 No Yes na na

24 Su Op 0 0 0 3 Ext

13

Pers

11

4 2 3 3 3 0 Yes Yes 3 3

25 Su Op 0 3st 0 3st Pers,Ext

131

Pers

205

2 2 3 3 na na No Yes na 3

26 Su Op 0 0 0 0 Pers,Ext

107

Pers,Ext

65

3 2 1 3 3 3 No No na na

27 Su Op 0 0 0 5 Pers

25

Pers

176

4 4 3 1 na 0 Yes Yes 3 1

28 Op Op 0st 0 0st 0 Pers

55

Pers

129

1 2 3 3 na 0 Yes No 3 3

29 Su Op 4 1 0 4 Pers,Ext

213

Pers,Ext

74

5 3 3 2 3 0 Yes Yes 0 0

30 Su Op 0 6 1 4 Pers,Ext

205

Pers,Ext

173

2 3 3 3 0 0 Yes Yes 3 1


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

31 Su Op 0 0 0 0 Pers

184

Pers

76

5 3 2 3 3 0 Yes Yes na na

32 Su Op 6st 0 6st 3 Ext

67

Pers

93

4 5 2 2 3 0 Yes nr 2 2

33 Op Op 3st 0 3st 8 Pers

125

Pers

61

2 1 3 3 0 0 No Yes 3 3

34 Su Op na 0 na 3 Pers,Ext

162

Pers,Ext

150

na 6 0 3 3 2.5 nr Yes 2 3

35 Su Op 0 0 4 4 Pers

66

Pers

83

4 2 2 3 3 0 Yes Yes 3 3

36 Su Op 0st 0 0st 0 Pers,Ext

631

Pers,Ext

358

4 4 3 3 3 4 Yes Yes 3 3

37 Su Op 2 0 5 2 Pers

251

Pers

91

3 4 3 3 4 4 Yes Yes 3 na

38 Op Op 0 3 0 4 Pers

68

Pers

31

4 1 3 3 3 0 Yes Yes na 0


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

39 Su Op 0 2 0 0 Pers

200

Pers

115

5 2 3 2 2 0 Yes Yes 2 2

40 Su Op 0 3st 0 3st Pers,Ext

79

Pers

54

4 4 3 3 3.5 0 Yes Yes 3 na

41 Op Op 0 0 2 2 Pers,Ext

306

Pers,Ext

409

na 7 2 3 2 0 Yes No na 2


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

Graduate Students

1 Su Su 3 na 0 na Pers

11

Pers

55

1 2 3 1 4 0 Yes No 3 2

2 Su Op 0 1 4 2 Pers

148

Pers

265

3 3 3 2 2 2 Yes Yes 3 2

3 Su Op 93st 0 93st 1 Ext

244

Ext

100

3 4 na 3 3 na No Yes 0 3

4 Su Op 37st 82st 37st 82st na

137

Pers

180

3 4 na 1 0 0 Yes No 0 0

5 Su Op 1 na 4 10 Pers

25

Pers

na

3 5 3 0 4 1 Yes Yes 3 0

6 Su Op 31st 3 31st 8 na

104

Ext

167

1 3 3 0 3 0 No No 1 1

7 Su Op 0 5 4 1 Pers

49

Pers

60

2 1 2 2 2.5 0 Yes Yes 2 1


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

8 Su Op 0 6 0 2 Pers

42

Pers

35

4 3 2 1 0 0 Yes No 3 1

9 Su Op 2 1 1 1 Pers,Ext

67

Pers

40

2 4 3 2 3 0 Yes nr na 2

10 Su Op 1 1 1 5 Pers

63

Pers

126

6 3 2 3 0 0 Yes Yes 3 2

11 Su Op 0 3 0 5 Pers

71

Pers

69

2 1 3 2 2 0 Yes Yes 2 2

12 Su Op 9st 1 9st 17 Pers

181

Pers

159

3 3 1 3 0 0 Yes Yes 1 3

13 Op Op 1 2 1 4 Pers

38

Pers

64

1 3 3 3 0 0 Yes Yes 2 3

14 Op Op 7 0 1 1 Pers

62

Pers,Ext

35

5 3 2 3 2 0 Yes No 2 0

15 Su Su 5 1 na 1 Pers

na

Pers

na

2 1 2 1 0 2 No Yes 3 1


Table A9 continued 

ID

Decision Decision

Latency

(seconds)

Analysis

Time

(seconds)

Evidence and Word

Count in Response to

Question 1

Argument

Repertoire

(number)

Certainty

(0 to 3

scale)

Self-

Assessed

Knowledge

(0 to 4 scale)

Affect Reported

Speed to

Decision

(0 to 3

scale)

CS P CS P CS P CS P CS P CS P CS P CS P CS P

16 Su Op 4 0 0 8 Pers

40

Pers

59

4 3 na 3 2 2 Yes Yes 1 3

17 Su Su 0 4st 0 4st Pers,Ext

99

Pers

24

3 2 3 1 3 0 Yes Yes 3 3

18 Su Op 10 26st 1 26st Pers

101

Pers

141

3 2 2 0 3 0 Yes Yes 0 1

 Note. st Decision Latency and Analysis Time for these decisions were coded as the same number of seconds. CS = class size decision,

P = privatization decision, Op = oppose, Su = support, Ext = external evidence, Pers = personal evidence, Non = nonevidence, na =

did not ask question, unc = unclear, und = undecided, nr = not responsive.


Table A10

Chi-Square Analyses of Certain Frequency Data

Variable                                    Legislators                          Graduate Students
                                            Obs. N    Exp. N     Chi-square      Obs. N    Exp. N    Chi-square
Class Size Decision (Oppose/Support)        12/26     19/19      5.58*           2/16      9/9       10.88***
Privatization Decision (Oppose/Support)     38/2      20/20      32.40***        15/3      9/9       8.00**
Citing Evidence Class Size Decision
  (Personal/External)                       25/16     20.5/20.5  1.97            13/3      8/8       6.25*
Citing Evidence Privatization Decision
  (Personal/External)                       32/9      20.5/20.5  12.90***        15/3      9/9       8.00**
Choice of Decision Model - Most People
  (Traditional/IDMR)                        1/30      15.5/15.5  27.12***        0/18      18        Constant
Choice of Decision Model - Self
  (Traditional/IDMR)                        5/27      16/16      15.12***        1/17      9/9       14.22***

*p < .05. **p < .01. ***p < .001.
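As a worked illustration of how the chi-square values in Table A10 can be reproduced (assuming the conventional Pearson goodness-of-fit formula, which is not restated elsewhere in the text), the legislators’ privatization decision row, with 38 opposing and 2 supporting against expected frequencies of 20 and 20, gives

\[
\chi^{2} = \sum \frac{(O - E)^{2}}{E} = \frac{(38 - 20)^{2}}{20} + \frac{(2 - 20)^{2}}{20} = 16.2 + 16.2 = 32.4,
\]

which matches the 32.40 reported in the table and, with one degree of freedom, is significant at p < .001.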


8. Which answer or decision to the Question is best? List your evidence.

9. What would someone who disagreed with you say the correct answer to the Question is? List the reasons why.

10. What evidence would you offer to convince those who disagree with you?

11. Has anyone faced this question or decision before you? If so, what can you learn from their experience?

12. Are there alternatives you have not considered?

OTHER CONSIDERATIONS:


References

Ackerman, P. & Beier, M. E. (2003). Trait complexes, cognitive investment, and domain

knowledge. In R. J. Sternberg & E. L. Grigorenko (Eds.), The psychology of 

abilities, competencies, and expertise (pp. 1-30). Cambridge, UK: Cambridge

University Press.

Ajzen, I. (1996). The social psychology of decision making. In E. T. Higgins & A. W.

Kruglanski (Eds.), Social psychology: Handbook of basic principles (297-325).

 New York: The Guilford Press.

Alexander, P. A. (1997). Mapping the multidimensional nature of domain learning: The

interplay of cognitive, motivational, and strategic forces. In M. L. Maehr & P. R.

Pintrich (Eds.), Advances in motivation and achievement (Vol. 10, pp. 213-250).

Greenwich, CT: JAI Press.

Alford, C. F. (2002). Group psychology is the state of nature. In K. R. Monroe (Ed.),

 Political psychology (pp. 193-205). Mahwah, NJ: Lawrence Erlbaum

Associates.

Anderson, J. R. (1987). Skill acquisition: Compilation of weak-method problem

solutions. Psychological Review, 94, 192-210.

Bargh, J. A., Chaiken, S., Govender, R., & Pratto, F. (1992). The generality of the

automatic attitude activation effect. Journal of Personality and Social 

 Psychology, 62(6), 893-912.

Bargh, J. A., Chaiken, S., Raymond, P., & Hymes, C. (1996). The automatic evaluation

effect: Unconditional automatic attitude activation with a pronunciation task.

 Journal of Experimental Social Psychology, 32, 104-128.


Bargh, J. A., & Chartrand, T. L. (1999). The unbearable automaticity of being. American

 Psychologist, 54, 462-479.

Baron, J. (2000). Thinking and deciding (3rd ed.). Cambridge, UK: Cambridge

University Press.

Bartels, L. M. (1996). Uninformed votes: Information effects in presidential elections.

 American Journal of Political Science, 40(1), 194-230.

Bereiter, C., & Scardamalia, M. (1993). Surpassing ourselves: An inquiry into the nature

and implications of expertise. Chicago: Open Court.

Beyer, B. K. (1985). Critical thinking: What is it? Social Education, 49, 270-276.

Boynton, G. R. (1995). Computational modeling: A computational model of a survey

respondent. In M. Lodge & K. M. McGraw (Eds.), Political judgment: Structure

and process (229-248). Ann Arbor, MI: University of Michigan Press.

Buehl, M. M., & Alexander, P. A. (2001). Beliefs about knowledge. Educational 

 Psychology Review, 13(4), 385-418.

Buehl, M. M., Alexander, P. A., Murphy, P. K., & Sperl, C. T. (2001). Profiling

 persuasion: The role of beliefs, knowledge, and interest in the processing of 

persuasive texts that vary by argument structure. Journal of Literacy Research, 33(2), 269-301.

Calvin, W. H. (1996). How brains think: Evolving intelligence, then and now. New York:

Basic Books.

Cappella, J. N., Price, V., & Nir, L. (2002). Argument repertoire as a reliable and valid

measure of opinion quality: Electronic dialogue during Campaign 2000. Political 

Communication, 19, 73-93.


Carlston, D. E., & Smith, E. R. (1996). Principles of mental representation. In E. T.

Higgins & A. W. Kruglanski (Eds.), Social psychology: Handbook of basic

 principles. New York: The Guilford Press.

Chaiken, S., & Trope, Y. (Eds.) (1999). Dual-process theories in social psychology. New

York: The Guilford Press.

Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55-

81.

Cherniak, C. (1986). Minimal rationality. Cambridge, MA: The MIT Press.

Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of 

 physics problems by experts and novices. Cognitive Science, 5, 121-152.

Clore, G. L. (1994). Why emotions require cognition. In P. Ekman & R. J. Davidson

(Eds.), The nature of emotion: Fundamental questions (pp. 181-191). New York:

Oxford University Press.

Cosmides, L., & Tooby, J. (1994). Beyond intuition and instinct blindness: Toward an

evolutionarily rigorous cognitive science. Cognition, 50, 41-77.

Damasio, A. R. (1994). Descartes’ error: Emotion, reason and the human brain. New

York: Avon Books.

Damasio, A. R. (1999). The feeling of what happens: Body and emotion in the making of 

consciousness. New York: Harcourt Brace & Company.

Denes-Raj, V., & Epstein, S. (1994). Conflict between intuitive and rational processing:

When people behave against their better judgment. Journal of Personality and 

Social Psychology, 66 (5), 819-829.

Dennett, D. (1991). Consciousness explained. Boston: Little, Brown and Company.


Edelman, G. M., & Tononi, G. (2000). A universe of consciousness: How matter becomes

imagination. New York: Basic Books.

Ennis, R. H. (1991). Critical thinking: A streamlined conception. Teaching Philosophy,

March 1991, 5-24.

Epstein, S. (1990). Cognitive-experiential self-theory. In L. Pervin (Ed.), Handbook of 

 personality: Theory and research (pp. 165-192). New York: The Guilford Press.

Epstein, S. (1998). Constructive thinking: The key to emotional intelligence. Westport,

CT: Praeger.

Epstein, S., & Pacini, R. (1999). Some basic issues regarding dual-process theories from

the perspective of cognitive-experiential self-theory. In S. Chaiken & Y. Trope

(Eds.), Dual-process theories in social psychology (462-482). New York: The

Guilford Press.

Epstein, S., Pacini, R., Denes-Raj, V., & Heier, H. (1996). Individual differences in

intuitive-experiential and analytical-rational thinking styles. Journal of 

 Personality and Social Psychology, 71(2), 390-405.

Ericsson, K. A. (2003). The search for general abilities and basic capacities: Theoretical

implications from the modifiability and complexity of mechanisms mediating

expert performance. In R. J. Sternberg & E. L. Grigorenko (Eds.), The psychology

of abilities, competencies, and expertise (93-125). Cambridge, UK: Cambridge

University Press.

Evans, J. St B. T. (1996). Deciding before you think: Relevance and reasoning in the

selection task. British Journal of Psychology, 87 , 223-240.

Evans, J. St B. T., & Over, D. E. (1996). Rationality and reasoning. East Sussex, UK:


Psychology Press.

Fazio, R. H., Sanbonmatsu, D. M., Powell, M. C., & Kardes, F. R. (1986). On the

automatic activation of attitudes. Journal of Personality and Social Psychology,

50(2), 229-238.

Feldman, S. (1995). Answering survey questions: The measurement and meaning of 

 public opinion. In M. Lodge & K. M. McGraw (Eds.), Political judgment:

Structure and process (249-270). Ann Arbor, MI: University of Michigan Press.

Feltovich, P. J., Spiro, R. J., & Coulson, R. L. (1997). Issues of expert flexibility in

contexts characterized by complexity and change. In P. J. Feltovich, K. M. Ford

& R. R. Hoffman (Eds.), Expertise in context: Human and machine (125-146).

Menlo Park, CA: AAAI Press/The MIT Press.

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The affect heuristic

in judgments of risks and benefits. Journal of Behavioral Decision Making, 13, 1-

17.

Fischer, G. W., & Johnson, E. J. (1986). Behavioral decision theory and political

decision making. In R. R. Lau & D. O. Sears (Eds.), Political cognition: The 19th

annual Carnegie symposium on cognition (55-65). Hillsdale, NJ: Lawrence

Erlbaum Associates.

Fischhoff, B. (1991). Value elicitation: Is there anything in there? In D. Kahneman & A.

Tversky (Eds.) (2000), Choices, values, and frames (pp. 620-641). Cambridge,

UK: Cambridge University Press.

Fishkin, J. S., & Laslett, P. (Eds.) (2003). Debating deliberative democracy. Oxford:

Blackwell Publishing.


Friedman, J. (Ed.) (1996). The rational choice controversy: Economic models of politics

reconsidered. New Haven, CT: Yale University Press.

Ganzach, Y. (2000). Judging risk and return of financial assets. Organizational Behavior 

and Human Decision Processes, 83(2), 353-370.

Geva, N., Mayhar, J., & Skorick, J. M. (2000). The cognitive calculus of foreign policy

decision making: An experimental assessment. Journal of Conflict Resolution,

44(4), 447-471.

Ghirardato, P. (2001). Coping with ignorance: Unforeseen contingencies and non-

additive uncertainty. Economic Theory, 17 , 247-276.

Gilbert, D. T., Tafarodi, R. W., & Malone, P. S. (1993). You can’t not believe everything

you read. Journal of Personality and Social Psychology, 65(2), 221-233.

Gilens, M. (2001). Political ignorance and collective policy preferences. American

Political Science Review, 95(2), 379-396.

Gilovich, T., & Griffin, D. (2002). Introduction -- Heuristics and biases: Then and now.

In T. Gilovich, D. Griffin, & D. Kahneman (Eds.) (2002), Heuristics and

biases: The psychology of intuitive judgment. Cambridge, UK: Cambridge

University Press.

Gilovich, T., Griffin, D., & Kahneman, D. (Eds.) (2002). Heuristics and biases: The

 psychology of intuitive judgment. Cambridge: Cambridge University Press.

Granberg, D. (1993). Political perception. In S. Iyengar & W. J. McGuire (Eds.),

 Explorations in political psychology (70-112). Durham, NC: Duke University

Press.


Green, D. P., & Shapiro, I. (1994). Pathologies of rational choice theory: A critique of 

applications in political science. New Haven, CT: Yale University Press.

Grether, D., & Plott, C. R. (1979). Economic theory of choice and the preference reversal

 phenomenon. American Economic Review, 69, 623-638.

Grigorenko, E. L. (2003). Expertise and mental disabilities: Bridging the unbridgeable?

In

R. J. Sternberg & E. L. Grigorenko (Eds.), The psychology of abilities,

competencies, and expertise (156-185). Cambridge, UK: Cambridge University

Press.

Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to

moral judgment. Psychological Review, 108(4), 814-834.

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains. American

 Psychologist, 53(4), 449-455.

Higgins, E. T., & Kruglanski, A. W. (Eds.) (1996). Social psychology: Handbook of

basic principles. New York: The Guilford Press.

Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories:

Beliefs about knowledge and knowing and their relation to learning. Review of 

 Educational Research, 67 (1), 88-140.

Hogarth, R. M., & Kunreuther, H. (1995). Decision making under ignorance: Arguing

with yourself. Journal of Risk and Uncertainty, 10, 15-36.

Holyoak, K. J. (1991). Symbolic connectionism: Toward third-generation theories of 

expertise. In K. A. Ericsson & J. Smith (Eds.), Toward a general theory of


expertise: Prospects and limits (pp. 301-335). New York: Cambridge University

Press.

Homel, R. J. & Lawrence, J. A. (1992). Sentencer orientation and case details: An

interactive analysis. Law and Human Behavior, 16 (5), 509-537.

Johnson, E. J. (1988). Expertise and decision under uncertainty: Performance and

 process.

In M. T. H. Chi, R. Glaser & M. J. Farr (Eds.), The nature of expertise (209-228).

Hillsdale, NJ: Lawrence Erlbaum Associates.

Kahneman, D., & Lovallo, D. (1993). Timid choices and bold forecasts: A cognitive

 perspective on risk taking. In D. Kahneman & A. Tversky (Eds.) (2000), Choices,

values, and frames (pp. 393-413). Cambridge, UK: Cambridge University Press.

Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics

and biases. Cambridge, UK: Cambridge University Press.

Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under 

risk. In D. Kahneman & A. Tversky (Eds.) (2000), Choices, values, and frames

(pp. 17-43). Cambridge, UK: Cambridge University Press.

Kahneman, D., & Tversky, A. (1984). Choices, values, and frames. In D. Kahneman &

A. Tversky (Eds.) (2000), Choices, values, and frames (pp. 1-16). Cambridge,

UK: Cambridge University Press.

Kahneman, D., & Tversky, A. (Eds.) (2000). Choices, values, and frames. Cambridge,

UK: Cambridge University Press.

Katzner, D. W. (1992). Operationality in the Shackle-Vickers approach to decision

making in ignorance. Journal of Post Keynesian Economics, 15(2), 229-254.


Kelsey, D. (1994). Maxmin expected utility and weight of evidence. Oxford Economic

 Papers, 46 , 425-444.

Kelsey, D., & Quiggin, J. (1992). Theories of choice under ignorance and uncertainty.

   Journal of Economic Surveys, 6 (2), 133-153.

Kirkpatrick, L., & Epstein, S. (1992). Cognitive-experiential self-theory and subjective

 probability: Further evidence for two conceptual systems. Journal of Personality

and Social Psychology, 63(4), 534-544.

Klaczynski, P. A. (1997). Bias in adolescents’ everyday reasoning and its relationship

with intellectual ability, personal theories, and self-serving motivation.

 Developmental Psychology, 33(2), 273-283.

Klaczynski, P. A., & Gordon, D. H. (1996). Everyday statistical reasoning during

adolescence and young adulthood: Motivational, general ability, and

developmental influences. Child Development, 67 , 2873-2891.

Klaczynski, P. A., & Narasimhan, G. (1998). Development of scientific reasoning biases:

Cognitive versus ego-protective explanations. Developmental Psychology, 34(1),

175-187.

Krampe, R. T., & Baltes, P. B. (2003). Intelligence as adaptive resource development and

resource allocation: A new look through the lenses of SOC and expertise. In R. J.

Sternberg & E. L. Grigorenko (Eds.), The psychology of abilities, competencies,

and expertise (31-69). Cambridge, UK: Cambridge University Press.

Krantz, D. H. (1991). From indices to mappings: The representational approach to

measurement. In D. Brown & J. Smith (Eds.), Frontiers of mathematical 

 psychology (pp. 1-52). New York: Springer-Verlag.


Kravchuk, R. S. (1989). A footnote to cost-benefit analysis applied under conditions of 

radical ignorance. Policy Studies Journal, 18(2), 325-341.

Kuhn, D. (1991). The skills of argument. Cambridge, UK: Cambridge University

Press.

Kuhn, D., Weinstock, M., & Flaton, R. (1994). How well do jurors reason? Competence

dimensions of individual variation in a juror reasoning task. Psychological 

Science, 5(5), 289-296.

Lau, R. R., & Redlawsk, D. P. (2001). Advantages and disadvantages of cognitive

heuristics in political decision making. American Journal of Political Science,

45(4), 951-971.

Lavine, H. (2002). On-line versus memory-based process models of political evaluation.

In K. R. Monroe (Ed.), Political psychology (pp. 193-205). Mahwah, NJ:

Lawrence Erlbaum Associates.

Lazarus, R. (1994). Appraisal: The long and short of it. In P. Ekman & R. J. Davidson

(Eds.), The nature of emotion: Fundamental questions (pp. 208-215). New York:

Oxford University Press.

Lipman, M. (1995). Moral education, higher-order thinking and philosophy for children.

 Early Child Development and Care, 107 , 61-70.

Lodge, M. (1995). Toward a procedural model of candidate evaluation. In M. Lodge &

K. M. McGraw (Eds.), Political judgment: Structure and process (111-140). Ann

Arbor, MI: University of Michigan Press.

Lodge, M., & Stroh, P. (1993). Inside the mental voting booth: An impression-driven

 process model of candidate evaluation. In S. Iyengar & W. J. McGuire (Eds.),


 Explorations in political psychology (225-263). Durham, NC: Duke University

Press.

Loewenstein, G. F., Weber, E. U., Hsee, C. K., & Welch, N. (2001). Risk as feelings.

 Psychological Bulletin, 127 (2), 267-286.

Lupia, A., McCubbins, M. D., & Popkin, S. L. (2000a). Beyond rationality: Reason and

the study of politics. In A. Lupia, M. D. McCubbins, & S. L. Popkin (Eds.),

 Elements of reason: Cognition, choice and the bounds of rationality (1-20).

Cambridge, UK: Cambridge University Press.

Lupia, A., McCubbins, M. D., & Popkin, S. L. (2000b).  Elements of reason: Cognition,

choice and the bounds of rationality. Cambridge, UK: Cambridge University

Press.

Lupia, A. (1994). Shortcuts versus encyclopedias: Information and voting behavior in

California insurance reform elections. American Political Science Review, 88(1),

63-76.

Madison, J., Hamilton, A., & Jay, J. (1788). The federalist papers.

Marcus, G. E., Neuman, W. R., & MacKuen, M. (2000). Affective intelligence and

political judgment. Chicago, IL: The University of Chicago Press.

McCarthy, C. L. (1996). What is “critical thinking”? Is it generalizable? Educational 

Theory, 46(2), 217-239.

McGraw, K. M., & Steenbergen, M. (1995). Pictures in the head: Memory

representations of political candidates. In M. Lodge & K. M. McGraw (Eds.),

 Political judgment: Structure and process (15-42). Ann Arbor, MI: University of 

Michigan Press.


Means, M. L., & Voss, J. F. (1996). Who reasons well? Two studies of informal

reasoning among children of different grade, ability, and knowledge levels.

Cognition and Instruction, 14(2), 139-178.

Minsky, M. (1997). Negative expertise. In P. J. Feltovich, K. M. Ford & R. R. Hoffman

(Eds.), Expertise in context: Human and machine (515-521). Menlo Park, CA:

AAAI Press/The MIT Press.

Morling, B., & Epstein, S. (1997). Compromises produced by the dialectic between self-

verification and self-enhancement. Journal of Personality and Social Psychology,

73(6), 1268-1283.

Murphy, S. T., & Zajonc, R. B. (1993). Affect, cognition, and awareness: Affective

 priming with optimal and suboptimal stimulus exposures. Journal of Personality

and Social Psychology, 64(5), 723-739.

 Nehring, K. (2000). A theory of rational choice under ignorance. Theory and Decision,

48, 205-240.

 Nisbett, R. E., & Wilson, T. D. (1977). Telling more than we can know: Verbal reports

on mental processes. Psychological Review, 84(3), 231-259.

Paul, R. (1993). Critical thinking: What every person needs to survive in a rapidly

changing world (3rd ed.). Santa Rosa, CA: Foundation for Critical Thinking.

Perkins, D. N., Farady, M., & Bushey, B. (1991). Everyday reasoning and the roots of 

intelligence. In J. F. Voss, D. N. Perkins, & J. W. Segal (Eds.), Informal 

reasoning and education (83-105). Hillsdale, NJ: Lawrence Erlbaum Associates.

Perry, W. G. (1970). Forms of intellectual and ethical development in the college years:

 A scheme. New York: Holt, Rinehart, and Winston.


Peters, E., & Slovic, P. (1996). The role of affect and worldviews as orienting

dispositions in the perception and acceptance of nuclear power. Journal of 

 Applied Social Psychology, 26 (16), 1427-1453.

Peters, E., & Slovic, P. (2000). The springs of action: Affective and analytical

information processing in choice. Personality and Social Psychology Bulletin,

26 (12), 1465-1475.

Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information

and repeated testing. Organizational Behavior and Human Decision Processes,

67 (1), 49-58.

Rahn, W. (1995). Candidate evaluation in complex information environments: Cognitive

organization and comparison processes. In M. Lodge & K. M. McGraw (Eds.),

 Political judgment: Structure and process (43-64). Ann Arbor, MI: University of 

Michigan Press.

Sears, D. O. (1993). Symbolic politics: A socio-psychological theory. In S. Iyengar & W.

J. McGuire (Eds.), Explorations in political psychology (113-149). Durham, NC:

Duke University Press.

Seifert, C. M., Patalano, A. L., Hammond, K. J., & Converse, T. M. (1997). Experience

and expertise: The role of memory in planning for opportunities. In P. J.

Feltovich, K. M. Ford & R. R. Hoffman (Eds.), Expertise in context: Human and 

machine (101-123). Menlo Park, CA: AAAI Press/The MIT Press.

Shafir, E., Simonson, I., & Tversky, A. (1993). Reason-based choice. In D. Kahneman &

A. Tversky (Eds.) (2000), Choices, values, and frames (pp. 597-619). Cambridge,

UK: Cambridge University Press.


Shanteau, J. (1992). How much information does an expert use? Is it relevant? Acta

 Psychologica, 81, 75-86.

Siegel, H. (1997). Rationality redeemed? New York: Routledge.

Sloman, S. A. (1996). The empirical case for two systems of reasoning. Psychological 

 Bulletin, 119(1), 3-22.

Slovic, P. (1991). The construction of preference. In D. Kahneman & A. Tversky (Eds.)

(2000), Choices, values, and frames (pp. 489-502). Cambridge, UK: Cambridge

University Press.

Slovic, P., Finucane, M., Peters, E., & MacGregor, D. G. (2002). The affect heuristic. In

T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The

 psychology of intuitive judgment (pp. 397-420). Cambridge, UK: Cambridge

University Press.

Sniderman, P. M., Brody, R. A., & Tetlock, P. E. (1991). Reasoning and choice:

 Explorations in political psychology. Cambridge, UK: Cambridge University

Press.

Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications

for the rationality debate. Behavioral and Brain Sciences, 23, 645-726.

Sternberg, R. J. (1997). Cognitive conceptions of expertise. In P. J. Feltovich, K. M. Ford

& R. R. Hoffman (Eds.), Expertise in context: Human and machine (149-162).

Menlo Park, CA: AAAI Press/The MIT Press.

Tesser, A., & Martin, L. (1996). The psychology of evaluation. In E. T. Higgins & A. W.

Kruglanski (Eds.), Social psychology: Handbook of basic principles (400-432).

 New York: The Guilford Press.


Tetlock, P. E. (1994). The slavery debate in antebellum America: Cognitive style,

value conflicts, and the limits of compromise. Journal of Personality and

Social Psychology, 66(1), 115-.

Todd, P. M., & Gigerenzer, G. (2000). Précis of Simple heuristics that make us smart.

 Behavioral and Brain Sciences, 23, 727-780.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The

conjunction fallacy in probability judgment. In Gilovich, T., Griffin, D., &

Kahneman, D. (Eds.) (2002), Heuristics and biases: The psychology of intuitive

 judgment. Cambridge, UK: Cambridge University Press.

Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative

representation of uncertainty. In D. Kahneman & A. Tversky (Eds.) (2000),

Choices, values, and frames (pp. 44-65). Cambridge, UK: Cambridge University

Press.

Voss, J. F., Lawrence, J. A., & Engle, R. A. (1991). From representation to decision: An

analysis of problem solving in international relations. In R. J. Sternberg & P. A.

Frensch (Eds.), Complex problem solving: Principles and mechanisms (pp. 119-

158). Hillsdale, NJ: Lawrence Erlbaum Associates.

Voss, J. F., Perkins, D. N., & Segal, J. W. (Eds.) (1991). Informal reasoning and 

education. Hillsdale, NJ: Lawrence Erlbaum Associates.

Voss, J. F., & Post, T. A. (1988). On the solving of ill-structured problems. In M. T. H.

Chi, R. Glaser, & M. J. Farr (Eds.), The nature of expertise (261-285). Hillsdale,

 NJ: Lawrence Erlbaum Associates.
