Revisiting ‘Weinberg’s Choice’: Classic Tensions in the Concept of Scientific Merit
Tomas Hellström · Merle Jacob
Published online: 6 September 2012
© Springer Science+Business Media B.V. 2012
Abstract Alvin Weinberg’s two classic and much-debated articles in Minerva,
“Criteria for Scientific Choice” (1963) and “Criteria for Scientific Choice II: The
Two Cultures” (1964), represent two of the first and most important attempts to
create a meta-discourse about priority setting in science policy, and many of the
points advanced remain relevant. The goal of this paper is to elaborate on the
relevance of some of Weinberg’s original arguments to priority setting today. We
have singled out four issues for attention: The tension between scientific and
institutional choice, the assumptions behind the triad of scientific, technological and
social merit, the elusive ‘externality from size’ argument for funding promoted by
Weinberg, and finally the problems involved in the idea of basic science as an
‘overhead cost’ for applied science, and applied science as an ‘overhead’ on a
sectoral mission. These four issues will be elaborated from a policy perspective and
connected to present day challenges for science and technology policy.
Keywords Science and technology policy · Scientific choice · Applied science · Big Science · Relevance
The main objective of science policy is the allocation of resources among different
fields of science, and the creation of governance instruments that can steer science
towards goals which would ultimately benefit the common weal. Once stated, these
objectives appear to be not only obvious but fairly simple. Resource allocation is
after all a central activity in policymaking, and STI policy is no exception to this
T. Hellström
Lund University School of Economics and Management, PO Box 7080, 220 07 Lund, Sweden
e-mail: [email protected]
M. Jacob (corresponding author)
Research Policy Institute, Lund University, PO Box 117, 221 00 Lund, Sweden
e-mail: [email protected]
Minerva (2012) 50:381–396
DOI 10.1007/s11024-012-9203-9
rule. Resource allocation and governance are inextricably linked in all policy areas.
However, this linkage renders science policy a very complex area because of the
radical uncertainties associated with knowledge production and the opacity that
characterizes the scientific endeavor from a policy perspective.
A key concern for any public investment is that of valued goals and outcomes,
and the reduction of uncertainty regarding the results of public expenditures. The
inclusion of US federal research funding under the Government Performance and
Results Act (GPRA) is an example of a family of outcome oriented governance
mechanisms which illustrate this point (Cozzens 1997, 2007). The emergence of
National Foresight Exercises in the 1980s and 1990s are examples of formative or
input type measures utilized in priority setting (cf. Irvine and Martin 1984; Martin
1995). Brooks (1978) argued that priority setting was necessary because society’s
expectations may rise faster than the amount of resources available for science, that
there are more opportunities than means, and that as a result there has developed “a
feeling that the cost from scientific progress arising from wrong choices of scientific
direction may be considerably higher than in the past” (171). Regardless of the
rationale or the methods used for ‘picking the winners’ from fields of science, policy
choice is in the end a matter of resolving the social and institutional concerns
embedded in the governance of science as a whole. Mapping and selecting among
research priorities is only one important type of strategic choice in science policy.
Another, perhaps more important, is the choice of institutional conditions for
science. Further, as the combinatory possibilities of choice increase, it also becomes
imperative to be able to consciously choose and argue for a framework for choice
itself. This, of course, implies several more possibilities for contestation. In this
article, we will (re)introduce, explicate and discuss one such framework, namely
that which Alvin Weinberg introduced in Minerva in the 1960s (Weinberg 1963,
1964). Our main interest is to revisit the issue of scientific choice as a problem in
contemporary science and technology policy by employing a reconstructive reading
of Weinberg’s two articles on scientific choice.
The Institutional Background
Four and a half decades ago, Weinberg, then director of Oak Ridge National
Laboratory (ORNL), outlined a proposal for a rational approach to priority setting in
science policy which integrates choice of science with that of governance. His
proposal in brief was that the argument for funding science needed to be viewed
against the backdrop of other pressing social and economic needs rather than as an
end in itself and that it involved institutional choices for science. While he
suggested that science benefits from becoming integrated with ‘normal’ policy, and
being subjected to clear expectations, he was also a covert defender of basic, non-
directed science. True to his time and context, Weinberg saw the possibility for
combining basic research and policy relevance within the organizational structure of
the national laboratory. Big Science conducted in the institutional setting of national
laboratories could address social and policy needs, as well as stimulate basic
research, more or less relevant to such needs. This being said, it is important to note
that the instrumentalities of the Cold War made basic scientific inquiry and policy
needs in many ways coextensive. The focus of the ORNL was reactors. However,
the biology section was the largest – especially that studying the impact of radiation
on living material (cf. Johnson and Schaffer 1994). Both fields created plenty of
opportunities for basic and cutting edge research into classical problems in physics
and biology.
Taken together, Weinberg’s key publications in Minerva (Weinberg 1963, 1964)
may be read as a recipe for how to introduce and fund large scale basic research
through the ‘Trojan horse’ of national priorities.1 In order to achieve this, science
policy must deal with specific questions in specific ways. Weinberg posed two sets
of related questions, the first of which was: ‘How can the funding of a scientific field
or program be motivated as a social choice?’ Specifically, given that all choice with
regard to science is socially situated, can there still be a ‘hierarchy of choice’ that
can guide policymakers? And secondly, ‘how should the issue of institutional
choice be addressed within science funding?’ Funding science is also about
selecting desirable organizational forms for science funding and for doing science.
How can these be motivated?
Several decades after the publication of Weinberg’s two seminal articles in
Minerva and the debate ensuing from that proposal (cf. Moravcsik 1973; Ziman
2000), priority setting remains a point of contention between science and policy. It
is therefore not surprising that since Weinberg, several other proposals have been
made, most if not all in the spirit of Weinberg’s proposal. The finalization thesis
(Schäfer and Burgess 1983) and the Mode 2/socially robust knowledge production
thesis (Gibbons et al. 1994; Nowotny et al. 2001) are among the most notable and
they both received responses from the academic community similar to that which
Weinberg’s proposal met. Science policy has more or less embraced Weinberg’s
position on publicly funded science since the late 1980s. The dominant ideology
among policymakers at least within the European Union and OECD member states
is that funding science must be motivated in terms of perceived benefits to the
common weal, and particularly to economic growth (OECD 2004; EU 2007). So
commonplace has this position become that among its main proponents, the issue of
difference appears to be what type of time frame should be set on science’s
‘payback’ to its funders and which of the broadly agreed on technological growth
areas are the most relevant at the time of choice.
Despite the similarities in spirit, there is much of contemporary policy that runs
counter to Weinberg’s ideas. For example, the current trend of investing in areas
such as nanotechnology, ICTs and renewable energy technologies reflects Weinberg’s
position that social priorities may be used as a strategic tool for steering
research and development. However, the similarities to Weinberg’s doctrine vanish
when one turns to the issue of institutional choice. Weinberg’s model recommends
large research institutes as the institutional choice. Current policy, however, tends to
1 We refer mainly to Weinberg’s collection of articles, including the two mentioned here, from 1967,
Reflections on Big Science, Cambridge, Mass.: MIT Press.
favor heterogeneous networks of fairly autonomous actors funded through
competitive grants. This approach differs in another respect from Weinberg in
that it eschews the notion of a ‘safe haven’ for basic research which was at the root
of Weinberg’s preference for large national government labs. Further, Weinberg’s
preoccupation with strategic choices did not extend to ‘economic competitiveness’
(Ziman 2000). To be fair, however, this reading must be contextualized with the
understanding that Weinberg was addressing a largely US audience and in this
setting publicly funded science was historically mission-oriented and administered
by federal agencies who were commissioning research for their own purposes. In
effect, the notion of science policy as a coherent policy area was not the historical
reference point for Weinberg as it had been for Vannevar Bush (cf. Brooks 1967).
Both Bush and Weinberg saw the possibilities of ‘peaceful coexistence’ between
scientists doing basic science and sectoral paymasters such as the military pursuing their
ends through science. Bush considered military involvement in science a manageable
risk for scientists wanting to do basic science: one that should be traded against their
national obligation (Hamblin 2002). Overt science-military/application relationships
appeared to him a necessary evil in some cases, and scientists, through some measured
boundary work, could persuade for instance the military to support both basic inquiry
and technical application. Weinberg saw the relationship in a different way. He believed
administration and scientific practice, at least as far as Big Science was concerned, to be
a one-package deal, and while the value underlying the practice of science was always
truth, “administrative choices must be made to guide the scientist […] as to which truth
to seek” (Weinberg 1986: 198). The metaphor of application being a Trojan horse for the
scientist servicing the aims of science thus fills two quite different purposes in the Bush
and Weinberg science policy doctrines. The Bush doctrine is well known. It is tied to the
‘first phase of science policy’ variously labeled depending on periodization, e.g. ‘the
cold war period’ (Brooks 1986), ‘supply-side research economics’ (Freeman 1987) or
what Salomon (1987) simply refers to as ‘the first phase.’ This first phase is characterized
by strong faith in science, military security concerns and some early institution building
for science funding. Weinberg’s call for explicit priority setting is best understood in
terms of the second and third phases (1960s–1970s) when military and socio-economic
development started to drive science policy and the notion of social utility from science
started to grow in complexity.
The Piganiol Report (OECD 1963) kicked off the new era and laid the foundations
for calculating expenditures on basic science, applied science and technological
development. Policy awareness of R&D expenditures went hand in hand with a self-
aware or formalized prioritization regime. This, in turn, produced a new variant of
the two-cultures tension, that between bureaucracy and research (Elzinga and
Jamison 1995). Examples of this include the 1960s Minerva debate which involved
several notable science administrators and scientists: e.g. Michael Polanyi on the
side of ‘free, self-regulating science’ and Weinberg on ‘the other side.’ Another
example was the inclusion of the special ‘News and Comment’ section in the journal
Science authored by Daniel Greenberg.
The 1971 Brooks report presaged the emergence of what can be identified as today’s
innovation-oriented doctrine, with the establishment of new science policy notions, such
as mission orientation, technology policy and social relevance (OECD 1971).
Contemporary policy shares many similarities with Brooks’ 1971 position on utility but
emphasizes economic rather than social development, industrial rather than military
interests, and, as mentioned earlier, prefers short- to medium-term funding arrange-
ments rather than bloc allocations to public research providers. The aforementioned shift
in focus from social to economic development may be read as a response to geopolitical
developments such as the oil crises, the emergence of the newly industrialized countries
(NICs) and industrial competition from Japan. These considerations triggered a science
policy emphasis on economic (commercial) competition through industrial innovation,
typically deriving from a national capacity to innovate around core (platform)
technologies and integrate technological forecasting into science policy.
This implied a decentralizing movement towards industrial ecologies, commercial
spin-outs from universities and a more socially distributed human capital approach
(aka ‘Knowledge Society’) rather than a purely technological focus. Governments and
international organizations, however, continue to pursue the ideal of setting national
goals for science and technology, identifying priority areas and investing selectively in
what they believe to be the conditions for next generation economic growth. The
emphasis on priority setting has intensified since Weinberg’s time with a clear
genealogy from the Bromley Report in 1971 on the academic, socio-economic and
military prospects for Physics sub-fields (Bromley 1971) to the foresight exercises of
the 1980s (Martin and Irvine 1989) and further to the theme selection within the EU
Framework Programs and similar activities in the US (e.g. the US Critical
Technologies Program from 1989 to 1998), in the UK, Japan, France and elsewhere.
As a case in point, apart from supporting industry innovation in selected areas, the
Framework Programs (1–7) also elaborated criteria for developing critical mass on a
European level and creating other forms of European value added (and from FP6 also
the structuring and extension of the European Research Area). Themes were typically
derived from major EU policy areas such as energy, health, agriculture, etc. There have
been observations to the effect that the FPs very rarely deprioritize any area from the
previous period in terms of funding, but rather expand the overall budgets (Georghiou
and Harper 2008). This might be a result of a lack of operational criteria and meta-
criteria for choice (specific topics are mainly elicited bottom-up). Overall, the criteria
to identify thematic domains in FP7 were reminiscent of previous ones identified in the
Bromley Report of 1971, namely economic growth (‘contribution to European policy
objectives’), possibility to perform excellent research in the area (‘European research
potential’), and the creation of non-duplicative critical mass and complementarity in
terms of industry initiatives (‘European value added’). The recent addition of Joint
Programming Initiatives is the latest effort on the part of the EU to promote a European
Research Area through coordinating priority setting among member states.
The Present Argument
Though some of Weinberg’s framework and original arguments are relevant also to
this new setting, it is the ambition of the present article to explicate those aspects of
his work that might provide a road back, and forward, to a science policy as opposed
to simply industrial or business policy. Such a policy doctrine would include lessons
learnt from previous periods in terms of setting priorities for specific fields, but also
retrace the first science policy period’s attention to policy as a means of stimulating
and facilitating the development of fields of science qua projects of understanding.
In addition, such a retracing exercise would allow the science policy analyst to
recognize in science policy the choices of social forms for science, and their
relevance, by offering a valuation framework for the institutional mechanisms used
in priority setting today.
In the light of this we propose to revisit Weinberg’s classic arguments on
scientific choice along the two main dimensions referred to above. These are utility
as a criterion for choice (i.e. how public funding of science can be motivated on
grounds of utility) and, secondly, the choice of institutional mechanism (i.e. the
organizational setting through which science should be funded and conducted).
These two dimensions will be dealt with in turn, with an aim to elaborate their
conceptual structure and practical implications, recognizing that we need to make
allowances for conceptual and normative ‘time travel’ (avoiding anachronism and
historicity) in order to address some pertinent issues of today. In this spirit, we will
outline two aspects of utility-driven choice viz. science funding as a ‘social
overhead’ on other areas of public funding (including science), and the ‘recursive’
nature of choice, where some research is funded on the basis of its value to other
fields. Secondly, we will outline Weinberg’s notions of institutional choice, in
particular two areas: firstly, the issue of network externalities and scale effects in
science and, secondly, the issue of choice of vehicles for setting priorities and
distributing funds. We will also demonstrate how these types of choice connect to
contemporary issues facing science policy. It is important from the outset to bear in
mind that Weinberg’s interest was primarily in the problems of funding Big Science
and it is in this spirit that we attempt to explicate and revisit his arguments.
Choice from Utility I: Science Funding as a Social Overhead
A notion running through Weinberg’s writing on scientific choice, particularly in his
second Minerva paper, ‘Criteria for Scientific Choice II’ (1964), is that of a choice hierarchy for science. This is where each investment in research is considered from
the point of view of its usefulness to some other more applied problem. The purpose
is to reduce the need to consider any investment in science as a blind or faith-based
‘speculation’ (a.k.a. endless frontier thinking) by always connecting it to a more
applied stage neighboring the current choice context. This, Weinberg argued,
implies looking at a field of inquiry as an ‘overhead’ on its neighboring more
applied fields: each choice situation should be ‘assessed upwards’ in terms of its
instrumental relationship to a more technically or socially useful level of problem-
solving. Through this reasoning, all of science can, in the end, be considered an
overhead investment in the pursuit of a sectoral/societal mission.
It is easiest to start this argument from the perspective of applied science. If any
field of applied science deserves funding, it is because it is considered a necessary
part of the problem-solving processes dedicated to a social goal. These problem-
solving processes, however, may be many more than just the ones we would call
‘scientific’; they involve practical problem-solving, including muddling-through
trial and error approaches, investments in already existing strategies and technol-
ogies for reaching the social goal, etc. The overhead of applied science on the
problem in question can therefore be viewed as a stake in a gamble that these other
non-scientific strategies can be effectively improved or replaced by something
derived from science. Because of this, the costs of a specific applied science
overhead must always be assessed in terms of whether other, more practical ‘non-
scientific’ strategies could achieve the goal within acceptable time and efficiency
constraints.
Reasoning from this perspective, the notion of a science budget is replaced by
that of a mission budget in which science represents a certain fraction. In
Weinberg’s model, this separation between the budgets for applied and other types
of science contributed to a type of accountability. Here is another issue on which
Weinberg and present science policymakers differ. The current thinking is that all
types of science should meet the same type of accountability criteria. With science
being part of a sectoral budget rather than a science budget, one has to first ask what
questions need answering from the perspective of the sectoral mission, and then
decide what percentage of the mission’s budget for science is likely to achieve some
relevant and adequate answers.
One contemporary version of the overhead theory of science funding, albeit
perhaps not representative of Weinberg’s original idea, can be traced to the Value
for Money thinking which was introduced into policy discourse in the wake of New
Public Management (e.g. Clark 2006). The Swedish Agency for National Innovation
Systems (VINNOVA) has attempted to use this approach as a partial post hoc
justification for R&D investments. However, calculations of this nature are rife with
problems and often lead to some whiggish, post hoc ergo propter hoc histories of
technological development (VINNOVA terminology for such narratives is ‘effect
studies’) in which national investments in a field of research are retrospectively
reinvented as having been strategically necessary for some technological break-
through (e.g. VINNOVA 2002).
Causal attributions of the science overhead to any specific technological/sectoral
effect are, and will probably remain, spurious even for retrospective analysis.
However, Weinberg’s claim was never this specific. His proposal for basic science
rests on three possible funding logics. One is as an overhead on applied science. Just
as applied science is an overhead on a mission, some basic science is needed to
make sure that applied science can be continuously successful. Another is that basic
science may be considered an overhead on science and technology overall, and be
legitimized as an activity necessary to maintain the institution of science. This
argument is a global version of the applied science utility of basic science argument.
In addition, basic science may be argued for on the basis of a cultural mission, and
be funded for reasons similar to that of state supported art. For reasons that should
now be obvious, Weinberg was drawn to the first two of these options. While he did
not deny the import of society maintaining a basic science activity set apart from
sectoral missions, he emphasized that by and large basic science is likely to be more
useful to applied science than most basic scientists would be willing to concede. In
addition, and to preempt an argument that will be further elaborated below, current
basic science, Weinberg argued, often has a history as an applied science which has
become increasingly ‘obscure’ as its scientific core has ‘thickened’ and formalized.
In this way, many basic science disciplines would be already ‘semi applied’ without
being conscious about it. Certain areas of information science like signal processing,
as well as Weinberg’s own favorite radiation biology are cases in point.
Choice from Utility II: The ‘Recursive’ Nature of Scientific Merit
Weinberg outlined three external criteria of merit for supporting specific programs:
scientific, technological, and social merit. The first, ‘scientific merit,’ is not external
to science as such, but is treated as external in this instance because it requires that
the merit of a particular disciplinary project should be judged by researchers from
disciplines or research orientations outside of that of the research being considered.
Judging the relevance of a scientific program to other fields rests on the assumption
that high relevance in this regard indicates a scientific ‘platform’ or ‘fundamental’
quality of the program vis-à-vis progress in other fields. Scientific merit is also an
issue of whether a scientific field is ‘ripe for exploitation’ in terms of technology.2
Thus, scientific merit is founded on the value of the proposed research to the rest of
science and technology.
Technological merit turns the focus to applied science and technological
research, and is an issue of whether technological trajectories are exploitable or
ready to translate into practical solutions, on the one hand, and, on the other,
whether or not the human capital, or talent, currently exists for achieving those
outcomes. These two factors of merit can be said to be ‘internal’ to a field. In
addition, technological merit is an issue of whether or not the social goals to which
the given technology is to be applied are worthwhile. If they are, merit increases in
relation to the problem-solving capacity of the technology. This position resonates
in the present day platform technology program pursued to some extent within most
STI policy frameworks.
Finally, social merit is also defined ‘recursively’ as that which brings most value
to other social goals apart from the ones directly intended. Weinberg also
recognized that social value is notoriously difficult to explicate on a collective scale.
We will elaborate on this later. For now it is sufficient to note the recursive nature of
his system of merit: a subject/discipline is valued for its potential to enrich the rest
of science, technology for its potential to solve other problems in the technological
field, and social merit is awarded for its contribution to other social values. The
presumed relative autonomy of these fields of choice prevents Weinberg’s thinking
about priority setting and merit from lapsing into simple linearity. Nowhere did he
assume that scientific choice must act as primus motor in creating social value. He
maintained that differentiated science funding could only be an issue of how fast
one wants a field to develop, rather than whether it should develop at all, and the
question ‘how fast’ is an issue mired in social interests. The interlinking of the fields
is accomplished on the policy level by applying the ‘overhead theory’ for funding.
2 Cf. Böhme et al.’s finalization thesis (Böhme et al. 1976).
However, as we will see, all such choices will be circumscribed by the mix of
epistemic and social considerations that bears on choice of institutional formats.
Weinberg, being a representative of Big Science and mission-oriented research,
formulates a vision closer to a strategic technology/sectoral mission model of
priority. Weinberg (1965)3 presented a strategy for choice where the value of a field
would be judged on the combination of the social relevance of a particular problem,
and the ripeness for practical exploitation of a technological field related to this
problem. Thus, sectorally defined interest should be central to choice, but the
technological opportunities which mediate and facilitate what can be done in the
sectoral interest are equally important. Science, in terms of basic understanding, is
in the first instance not relevant to this issue. Technological aims, Weinberg argued,
are usually more closely scrutinized than scientific aims, from the point of view of
what is possible as well as with regard to what is socially attractive. Put simply,
more people have the ability to discuss issues of technological than of scientific
merit.
Given that policy for science is an endeavor in which the public has little
opportunity to engage, despite its interest in the practical outcomes of science, it is
remarkable how little reference Weinberg made to the political implications of
scientific choice. For instance, social merit criteria assume that the formal political
structure is able to reflect the widest set of social needs present in society. In this
sense, contemporary science policy’s introduction of stakeholder and user
intervention in agenda setting for science is an improvement on Weinberg’s social
criteria in so far as it allows room for input from civil society. The contemporary
version, however, is still hampered by the fact that the rationale behind stakeholder
involvement appears to be simply cost sharing and increased economic growth rather
than making available a social agenda for science based on democratic principles
(see, for example, Shove and Rip 2000). Thus, science policy still suffers from a
democratic deficit since civil society’s influence on agenda setting is limited to
those who can afford to pay for direct influence on science or those who represent
substantial economic-symbolic interests. In this respect, little has changed with the
discursive shift from science to STI policy. To the extent that politics figures in
discussions about science and/or innovation policy, it is usually assumed that the
only relevant political issue pertaining to science is which science will best
contribute to the national growth effort, and what the balance should be between
that and the science which contributes to the development of science itself.
Institutional Choice I: Intellectual Density and Externality from Size
Weinberg proposed that the problem of choice in science policy would be made
more amenable to rational solutions if one distinguished between the scientific
fields, and the institutions to be supported as vehicles for developing those fields.
Although problems of institutional and scientific choice are inextricably linked,
policymakers have to separate the two for a number of reasons. One is that internal
3 Chapter 3 in Reflections.
choice criteria can often only be operationalized by experts. While this means that
there is an aspect of science policy which is not transparent to the policymaker, it
does not imply that policymakers are unable to exercise influence in this sphere.
Thus, a key activity in STI policy is developing principles for stimulating and
organizing science accessible to policy. This involves formulating easy to
manipulate independent variables that bear on the mode and efficiency of
knowledge production. Two such principles discussed by Weinberg are intellectual
density and externality from size. Intellectual density refers to the amount of
academic activity that exists in a scientific discipline, such as the number of workers
engaged in research, degree of institutionalization (e.g. has the field managed to
acquire departments in all major universities, number of students, number of
journals, international meetings, etc.). The externality from size argument is that
once a field reaches a certain size, it is able to derive positive externalities such as
the creation of international networks, projects, student and senior faculty
exchanges.
These two qualities are associated in Weinberg’s thinking to the extent that one
of them is a consequence of the other. In the first place, Weinberg’s basic
assumption was that in order to achieve the kinds of economies of scale that the
modern scientific project promises, it ought to be exploiting every possibility for
agglomeration and focus, i.e. ways of keeping scientific work focused on a mission
while growing in volume. The argument is that as science grows, there is a natural
tendency towards fragmentation. Creating larger units within more coherent (read
mission-oriented) organizational structures prevents this tendency. Fragmentation
has many consequences. One of them is that it increases the time required to
uncover contradictions between areas of science which deal with the same set of
interconnected phenomena, but with different theoretical approaches and instru-
mentation. Communication and coordination can resolve such contradictions, but
that requires organizational solutions.
In the paper ‘But is the teacher also a citizen?’ published in Science in 1965 and
republished in Reflections, Weinberg asserted that all of science starts out from a
practical goal, for example the challenge of solving a properly external problem of social relevance. Often, however, it is a problem-solving task related to another scientific program – it is 'externally motivated with a scientific origin.'4 Weinberg
further contended that such second-order branching out of problem-solving
activities risks creating scientific branches which are, in their own right, esoteric and largely irrelevant, leading to what Weinberg, following von Neumann, refers to as 'baroque science' (Weinberg 1965). Baroque science is a type of unnecessary
elaboration of complex detail, largely detached from the goals of related scientific
fields and only relevant to internal field discourses. Relevance to other fields of
science was one way of inhibiting the development of baroque science and
Weinberg argued that this is best achieved in a larger organization with a societal
mission. Relevance to other scientific fields is a measure of quality in a field or program, but it can also be an organizational principle for research management, the implication being to create density and intellectual scale effects by organizing into large units. According to Weinberg, ex ante project evaluations should take the possibility of such effects into account, and many of today's evaluation frameworks do.

4 This seems to be a mix of Larry Laudan's internalist assertion (1977) that science progresses through solving scientific problems and the more externalist assumption that science rests, more or less indirectly, on a foundation of real-world problems (Hessen 1931). In addition to this well-known distinction, Weinberg also has the 'internal-external' hybrid of a discipline trying to solve a problem external to it yet internal to science as such.
The second set of criteria related to size is of a more social character and
concerns human capital and social relations of national or regional import. In the
first case, Weinberg notes that there is a cost-benefit trade-off concerning outcomes
in large scientific programs that is very uncertain from the point of view of science
and/or society. With regard to purely scientific outcomes, this was most aptly demonstrated by the US Congress's decision in 1993 to abandon construction of the Superconducting Super Collider (SSC). The opportunity cost for science and society of binding up large amounts of human capital in highly uncertain projects is an argument against curiosity-driven mega-science. However, such organizations, if managed correctly, may provide a platform for quickly mobilizing talent around pertinent, long- or short-term social goals. Weinberg's proffered solution to this dilemma was the creation of relatively buffered national labs where risky scientific and technological opportunities can be pursued in situations of social need and market failure. The challenge here is how to diversify such environments
to maintain a good balance between fields and between basic and applied projects.
On the social side, Weinberg noted that such environments and projects are
excellent platforms for international cooperation and for generating spillovers
across borders, that is, to realize 'internationalism' through the vehicle of science. The SSC is again a case in point, as much of its funding was in fact redirected to NASA's contribution to the International Space Station (ISS) program.
Another example of overt internationalism is ITER (originally an acronym for the International Thermonuclear Experimental Reactor), a 30-year, €16 billion project originating in the Euratom cooperation and currently under construction in southern France. ITER aims to develop fusion technology for energy production, in addition to aiding basic and applied research in nuclear/plasma physics. At the time of writing, it involves seven parties (including the EU, which is the host, the US and China) and is a typical example of a project enabled by considerations of international relations as well as pressing societal and basic research problems. The
European Spallation Source (ESS) located in Southern Sweden and involving more
than 15 partner countries is a similar initiative. The neutron research which will be
made possible by the ESS is expected to contribute to science and technological
development in areas as diverse as ICTs, transport, biotech and health.
Internationalism, size externalities, and critical mass are the arguments used to
motivate both these initiatives in technical as well as social terms. In both cases,
however, criticism has focused on exactly these issues, arguing that the concentration of resources is a high-risk strategy for science and the environment (see
Jacob and Hallonsten 2012). The way scientific values are to be translated into
public values is essentially a political negotiation process and the ESS case in
particular illustrates how the social value discourse has been conducted as an
intrinsic part of these programs since their inception, rather than floating aloof from
public interests, as is more often the case in basic science-driven programs.
Institutional Choice II: Vehicles for Research and Mechanisms for Funding
A second type of institutional choice relates to the organizational platforms for
conducting scientific research, that is, which environments to support and which funding mechanisms to select. Scientific and institutional choice are clearly related;
for example, the capacity of a given scientific group is likely to depend on several
institutional conditions apart from academic excellence, e.g. the funding structure,
infrastructure availability, likelihood of future complementary revenue streams
(which partly depends on other policies), availability and distribution of other
complementary resources. Choice also concerns the institutions which will govern the disbursement of funds, that is, which department, council or funding/evaluation mechanism will be employed. Here the concern is with the relation between funding mechanisms and the governance of a given scientific field. Not all types of science are furthered in the same way, and preferred outcomes are related not only to what a given field is today but also to what a given stakeholder may want it to become in the future (Schmoch and Schubert 2009).
Although at first sight intuitive, Weinberg’s proposed division between science
and its institutions suggests the possibility of a policy impasse. If, for example,
institutional choice is taken to refer to the institutions that are expected to deliver the
scientific output, then there may be a conflict regarding what is perceived as optimal
for science (e.g. academic excellence) and what is perceived as optimal for policy (e.g.
institutional distribution of resources in a research system). A concrete illustration
of this problem may be found in the choice between local excellence and a broad
distribution of research competence nationally. One of the core goals of science
policy in Nordic social democracies has been to create the possibility for scientific
institutions to reach critical mass and high quality delivery regardless of their
location. In recent years, a parallel policy objective has emerged, namely for some
of these institutions to achieve world class excellence. This implies that the
policymaker not only has to ensure the availability of infrastructure and other resources to promote excellence at the national level, but also has to ensure that some national institutions are ahead of, or at least equal to, the most outstanding institutions in the field internationally. Given resource constraints and the size of the country, this additional requirement implies that the goal of the widest geographic coverage would, at least in some cases, have to be sacrificed to achieve the critical mass of resources required to sustain world-class excellence.
Thus, two possible interpretations of institutional choice in science can be shown
to create at least two different sets of dependencies between the scientific and the
institutional aspect of policy choice, and it is important to consider both. Weinberg recognized many of these issues; however, he did not distinguish or relate them in any systematic way, but rather referred to them as the ''sum of separate decisions which determines the policy as a whole'' (Weinberg 1965: 66), before going on to focus on priority setting between branches of science. This is interesting, since the 'scale of values' suggested by Weinberg for such choices seems to be most aptly conceived when branches of science are viewed in their institutional context.
The research-producing institution or organization strongly influences the shape of its research processes and types of output, that is, whether science is basic, applied or strategic, while the funding institution affects the form and type of scientific outputs as well as how they may be put to use.
One way of addressing these aspects together is for policymakers to be aware of
the broader epistemic decisions implied in choosing one set of institutions over
another. An example from Swedish science policy may serve to illustrate this.
Sweden has a public R&D structure in which universities dominate; there are few
other national public R&D providers that can compete convincingly with these
institutions. Naturally, with the shift from science to STI policy, some policymakers
have considered the possibility that their policy objectives may be better realized in
a more institutionally diverse R&D system. This has led to several calls for the
broadening of the already very limited research institute sector. Shifting funds in this way would amount to a policy choice between research quality and short-term industrial relevance, but more problematically it would also embed assumptions about what type of knowledge best underpins what type of development.
Discussion and Conclusions
Although priority setting and institutional choice are central problems in STI policy,
they remain highly contested issues in the dialogue between science and policy.
Proposals such as Weinberg's have, as a result, traditionally met with fierce resistance from the scientific community. One reason for the friction is the tension that exists between the view that science should be accountable to the public and the understanding that the diversity and special nature of science make it unlikely that
any one formula for governance would be adequate. Finding a model for priority
setting and institutional choice in science that would balance these criteria to the
liking of both policymakers and scientists is akin to finding the Archimedean point
of science policy. In this article, we have revisited Weinberg’s proposal for handling
this problem and juxtaposed it with a few contemporary solutions.
As mentioned earlier, one may read Weinberg's approach to priority setting in some contemporary science policy moments. One such is the grand challenge
approach to funding science which combines Weinberg’s recommendation about
focusing on societal relevance with a critical mass approach. Although the
institutional choice issues differ from Weinberg's insofar as universities, and not large-scale research institutes, are the sites where grand challenge research is located,
the issues of scale remain key. Contemporary science policy, however, has access to technologies, not available in Weinberg's day, that make inter- and intranational network arrangements possible.
Although Weinberg was keen on steering science towards social goals, he was
careful to point out the need for investment in basic science. In fact, Weinberg’s
primary lesson seems to be one of heterogeneity both at the level of priority setting
and types of investments. This is an essential point of difference with contemporary
science policy where nations seem to be converging on similar types of priorities
and instruments for promoting them. Part of this convergence may be related to the
need to collaborate across borders to deal with grand challenges such as climate
change. Another has to do with the increased benchmarking of public R&D across
nations as a result of the widespread adoption of new public management.
Weinberg devoted considerable attention in his model to the question of how investment in basic science could be motivated. On the one hand, he was very wary
of basic science becoming decoupled from any type of relevance criteria. On the
other hand, he saw funding for basic science as a cost which society incurred in order
to get the utility that science delivered in other respects, hence the social overhead
argument. Weinberg believed that the danger of baroque science could be avoided
through the application of external merit criteria, i.e. relevance to fields outside of
the field which is proposing the research, potential exploitability and social
relevance. Although less explicit, contemporary policy documents do echo Weinberg's fears about the risk of basic science developing a trajectory which is decoupled from socioeconomic concerns. This does not, however, translate into an effort to design governance measures particular to basic science, as it does in Weinberg's case. Many countries choose the alternative strategy of applying the
same performance criteria to all types of research. One example is the EU member
countries' efforts to increase the social contribution of science, which are conducted primarily, though not exclusively, by mandating universities to engage in third-stream activities. This is a more invasive version of Weinberg's proposal since it includes
the educational activities of the university and it appears to treat all institutions of
higher education and research as being equally adaptable to the goals of mission-
oriented science, in spite of the popular policy argument that higher education and
research institutions should profile themselves.
Contemporary science policy would certainly benefit from revisiting Weinberg’s
approach on the question of institutional choice. He assumed that science would
continue to be conducted in a wide range of different institutional settings and that
this diversity was in part motivated by the nature of the mission. Institutional
heterogeneity remains a central property of the US higher education and research
sector. However, despite the continued fascination of European science policymakers with US science policy, their attempts to emulate this system tend to stop
short of promoting heterogeneity. In addition, the issue of institutional choice in
contemporary STI policy in Europe is complicated by the fact that block funding
has lost currency to project and program funding. The popularity of project and
program funding modalities reduces the possibility for institutional diversity.
Finally, one of the little-discussed problems with Weinberg's proposal about
treating basic science as ‘an overhead’ cost for applied science, and applied science
as an ‘overhead’ on a sectoral mission is that it would require either substantial
input from science itself in the politics of resource allocation or that those charged
with this task in the political community have considerable knowledge of science
itself. Oddly enough, it is becoming increasingly clear that contemporary STI policy requires similar expertise in priority setting. This call for more expertise, however,
has to be balanced against the fact that STI policy continues to run a democratic
deficit despite its rhetoric of social accountability and user involvement.
References
Bromley, D. A. (Chairman Physics Survey Committee) 1971. Physics in perspective. Washington, DC:
National Research Council/National Academy of Sciences.
Brooks, Harvey. 1967. Science and the allocation of resources. The American Psychologist 22: 187–201.
Brooks, Harvey. 1978. The problem of research priorities. Daedalus 107(2): 171–190.
Brooks, Harvey. 1986. National science policy and technological innovation. In The positive sum strategy: Harnessing technology for national growth, eds. R. Landau, and N. Rosenberg.
Washington, DC: National Academy of Sciences.
Bohme, Gernot, Wolfgang van den Daele, and Wolfgang Krohn. 1976. Finalization in science. Social Science Information 15(2–3): 307–330.
Clark, B. 2006. Value for money in science. Economic Affairs 9(2): 30–34.
Cozzens, Susan. 1997. The knowledge pool: Measurement challenges in evaluating fundamental research
programs. Evaluation and Program Planning 20: 77–89.
Cozzens, Susan. 2007. Death by peer review? The impact of results-oriented management in U.S.
research. In The Changing Governance of the Sciences, eds. Richard Whitley, and Jochen Glaser,
225–242. Dordrecht: Springer.
Darwin Hamblin, J. 2002. The Navy’s ‘sophisticated’ pursuit of science: Undersea warfare, the limits of
internationalism, and the utility of basic research 1945–1956. Isis 93(1): 1–27.
Elzinga, Aant, and Andrew Jamison. 1995. Changing policy agendas in science and technology. In
Handbook of science and technology studies, eds. Sheila Jasanoff, G.E. Markle, J.C. Petersen, and
Trevor Pinch, 572–597. Thousand Oaks: Sage Publications.
European Commission 2007. The European research area: New perspectives. Green Paper, Brussels,
Belgium.
Freeman, Christopher. 1987. Technology policy and economic performance: Lessons from Japan.
London: Pinter.
Georghiou, Luke, and Jennifer Cassingena Harper. 2008. FTA for research and innovation policy strategy.
Downloaded 24 April 2011 from: http://forera.jrc.ec.europa.eu/fta_2008/anchor_paper_3.pdf.
Gibbons, Michael, C. Limoges, H. Nowotny, S. Schwartzman, P. Scott, and M. Trow. 1994. The new production of knowledge. London: Sage.
Hessen, Boris. 1931. The social and economic roots of Newton's Principia. In Science at the crossroads, ed. N.I. Bukharin, 151–212. London: Frank Cass. (Reprint New York 1971).
Irvine, John, and Ben Martin. 1984. Foresight in science: Picking the winners. London: Francis Pinter.
Jacob, Merle, and Olof Hallonsten. 2012. The persistence of big science and megascience in research and
innovation policy. Science and Public Policy 39(4): 411–415.
Johnson, Leland, and Daniel Schaffer. 1994. Oak Ridge National Laboratory: The first fifty years.
Knoxville, TN: University of Tennessee Press.
Laudan, Larry. 1977. Progress and its problems: Toward a theory of scientific growth. Berkeley, CA:
University of California Press.
Martin, Ben. 1995. Foresight in science and technology. Technology Analysis & Strategic Management 7(2): 139–168.
Martin, Ben R., and John Irvine. 1989. Research foresight: Priority-setting in science. London and New
York: Pinter Publishers.
Moravcsik, Michael. 1973. A refinement of extrinsic criteria for scientific choice. Research Policy 3:
88–97.
Nowotny, Helga, Peter Scott, and Michael Gibbons. 2001. Rethinking science: Knowledge and the public in an age of uncertainty. Oxford: Polity Press.
Organization for Economic Cooperation and Development. 1963. Science and the policies of governments. Paris: OECD.
Organization for Economic Cooperation and Development. 1971. Science, growth and society: A new perspective. Paris: OECD.
Organization for Economic Cooperation and Development. 2004. Science, technology and innovation for the 21st Century. Meeting of the OECD committee for scientific and technological policy at ministerial level, 29–30 January 2004 – final communique. Paris: OECD.
Salomon, Jean-Jacques. 1987. Science and government: A European perspective. In Science for publicpolicy, ed. Harvey Brooks, 27–36. Oxford: Pergamon Press.
Schmoch, Ulrich, and Torben Schubert. 2009. Sustainability of incentives for excellent research—The
German case. Scientometrics 81(1): 195–218.
Schafer, Wolf, and P. Burgess. 1983. Finalization in science: The social orientation of scientific progress (Boston Studies in the Philosophy of Science). Dordrecht: Reidel.
Shove, Elizabeth, and Arie Rip. 2000. Users and unicorns: A discussion of mythical beasts in interactive
science. Science and Public Policy 27(3): 175–182.
VINNOVA. 2002. Effekter av VINNOVAs foregangares stod till behovsmotiverad forskning - Fyra effektanalyser av insatser under perioden 1975–2000. Stockholm: VINNOVA.
Weinberg, Alvin. 1963/2000. Criteria for scientific choice. Minerva 1(2): 159–171. Reprinted in Minerva 38(3).
Weinberg, Alvin. 1964/2000. Criteria for scientific choice II: The two cultures. Minerva 3(1): 3–14. Reprinted in Minerva 38(3).
Weinberg, Alvin. 1965. But is the teacher also a citizen? Science 149: 601–606.
Weinberg, Alvin. 1986. Revisiting criteria for scientific choice. Czechoslovak Journal of Physics B 36: 195–199.
Ziman, John. 2000. Criteria for scientific choice—Commentary. Minerva 38(3): 267–269.