Stimulating Research on Science of Science & Innovation
Julia Lane, NSF
Overview
Science of Science & Innovation
Current activities
– SciSIP
– Examples of funded research
Future activities
– STAR METRICS
– International collaboration
Theory
Production function framework great for aggregate impacts (Haskell/Clayton); an illustrative form is sketched after this list
At the micro level, not so clear
– Discovery – innovation highly nonlinear
– Unit of analysis?
– Dependent on organizational systems
Establishing counterfactuals
Outcome measures?
– Scientific
– Economic
– Social
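To fix ideas, here is a minimal illustrative version of the aggregate production-function approach mentioned above. It is a sketch rather than the framework used in the cited work: output Y is produced from capital K, labor L, and an accumulated R&D (knowledge) stock R, so a single elasticity gamma summarizes the aggregate return to science investment.

% Illustrative aggregate production function with an R&D stock term.
% All symbols (Y, A, K, L, R, alpha, beta, gamma) are generic, not taken from the slides.
\[
  Y_t = A_t\, K_t^{\alpha}\, L_t^{\beta}\, R_t^{\gamma}
  \qquad\Longrightarrow\qquad
  \Delta \ln Y_t = \Delta \ln A_t + \alpha\,\Delta \ln K_t + \beta\,\Delta \ln L_t + \gamma\,\Delta \ln R_t
\]

The gamma * Delta ln R term is the share of output growth attributed to R&D at the aggregate level; the micro-level caveats in the list above are exactly what this form assumes away (nonlinear discovery dynamics, the unit of analysis, and organizational structure).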
Empirical
Structural
– Science agencies have proposal and award administration systems => no systematic frame of individuals "touched" by science funding
Data collection potential
– Substantial investments in some areas – notably patents and citations, and potential in others (LEHD) – but fragmented and voluntary in nature
– Ethnographic information, most obviously in biotechnology and nanotechnology
– Little chance for random assignment
– Possible natural experiment with ARRA
Role of Evaluation
Where and how to intervene?
– R&D tax credit
– "Innovative work force"
– Broadband
– Science funding in general
  • Levels
  • Disciplines
  • Portfolio (geographic, riskiness, size, structure...)
Comparison to other federal investments
– Health, workforce, education, climate change
So…here’s a chance to build out a field
United States
– SciSIP program established in 2005, $8-10 million/year
– Explicitly interdisciplinary – economists, sociologists, psychologists, political scientists, anthropologists
– Goals: understanding (theories); measurement (models, metrics, datasets); community of practice (academics, practitioners)
– Examples: Labor economics; Aging research
United Kingdom
– NESTA
– Department of Business, Innovation and Skills
– Research Councils
– New Government?
Theoretical Foundation: Researcher Input
SciSIP program
– Understanding (knowledge and theories)
– Measurement (improve and expand science metrics, datasets and analytical models and tools)
– Community of practice (website, workshops, listserv)
75 awards made since 2007 – about $36 million direct, $10 million leveraged from other programs; fourth solicitation – over 100 proposals submitted
Examples of research already funded
Economics
– Azoulay/Graff-Zivin: Superstar Scientists
– Hobijn/Comin: Technology Adoption and Diffusion
Sociology
– Woody Powell and others: Networks
– Zucker/Darby: Large-scale data infrastructure
Psychology
– Schunn: Analysis of team interactions
– Gero: Situated cognition views of innovation
Visualization
– Visual Analytics
– Mapping
STAR (Science and Technology in America’s Recovery)
Empirical Foundation: Building a framework
Framework: a collection of integrated databases (a minimal linkage sketch follows the diagram below)
• Agency records transmitted on a flow basis
• University records transmitted on a flow basis
Reduce burden on PIs and universities
– Automated webscraping and reporting of outcomes to agencies, state legislatures and other constituencies
– Systematized, standardized and validated ongoing measurement of the long-term impact of science
  • Economic: patents, patent applications, new businesses
  • Scientific outcomes: creation and uptake of ideas, e.g. citations, new fields
  • Social outcomes: health, welfare, environment…
[Diagram: STAR pilot architecture. Within the institution, agency award budgets, state funding and endowment funding flow through the financial system into the HR, procurement and subcontracting systems (hiring personnel, buying from vendors, engaging contractors, disbursements). Award records and existing institutional reporting feed the STAR pilot project's data acquisition and analysis, which produces direct benefit analysis, intellectual property benefit analysis, innovation analysis (start-ups, papers, patents), jobs/purchases/contracts benefit analysis, and a detailed characterization and summary returned to the agency.]
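To make the "integrated databases" idea concrete, here is a minimal sketch, in Python, of the kind of linkage the framework implies: university HR and procurement records are joined to agency award records on a shared award identifier, giving each award an administratively based economic footprint. This is not the actual STAR METRICS implementation; the file names and column names are hypothetical.

# Minimal sketch of the linkage implied by the framework above.
# All file and column names are hypothetical, not the real STAR METRICS schema.
import pandas as pd

# Award records transmitted by the funding agency on a flow basis.
awards = pd.read_csv("agency_awards.csv")            # award_id, agency, amount

# University HR system: people paid from each award in a given quarter.
payroll = pd.read_csv("university_payroll.csv")      # award_id, person_id, fte, quarter

# University procurement system: vendor purchases charged to each award.
purchases = pd.read_csv("university_purchases.csv")  # award_id, vendor_id, spend

# Jobs supported per award: distinct people and total FTEs charged to the award.
jobs = (payroll.groupby("award_id")
               .agg(people=("person_id", "nunique"), fte=("fte", "sum"))
               .reset_index())

# Vendor spending per award.
spend = purchases.groupby("award_id", as_index=False)["spend"].sum()

# One row per award: an externally verifiable, administrative measure of its footprint.
report = (awards.merge(jobs, on="award_id", how="left")
                .merge(spend, on="award_id", how="left"))
print(report.head())

Because the inputs are the university's own administrative systems, the same join can be rerun each quarter as new records arrive, which is what "transmitted on a flow basis" buys.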
Examples of Research Possibilities
Modeling Policy Outcomes: Mapping Innovation Pathways
Partnership with Science Agencies and Universities
Current:
– OSTP and major science agency led initiative
– Actual, administratively based, externally verifiable measures of job creation for pilot universities
– Expanding to additional universities
University faculty invited to participate in a matching exercise with citations, patents, patent applications and other economic/scientific/social outcome metrics (a toy matching sketch follows this list)
– Report to OSTP and participating agencies scheduled for March 2009
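As a toy illustration of the matching exercise just described (not the actual procedure), the sketch below proposes candidate links between award principal investigators and publication records by normalized name similarity; in practice the invited faculty would confirm or reject each proposed link. The names, record layouts and threshold are hypothetical.

# Toy sketch of PI-to-publication matching; names, layouts and threshold are hypothetical.
from difflib import SequenceMatcher

pis = [
    {"award_id": "A-001", "pi_name": "Jane Q. Doe"},
    {"award_id": "A-002", "pi_name": "R. Smith"},
]
publications = [
    {"doc_id": "pub-17", "author": "Doe, Jane Q.", "title": "Networks of discovery"},
    {"doc_id": "pub-42", "author": "Smith, Robert", "title": "Diffusion of lab technology"},
]

def normalize(name: str) -> str:
    # Lowercase, drop punctuation, sort tokens so "Doe, Jane" compares like "Jane Doe".
    tokens = name.replace(",", " ").replace(".", " ").lower().split()
    return " ".join(sorted(tokens))

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# Propose candidate matches above an arbitrary similarity threshold for faculty review.
for pi in pis:
    for pub in publications:
        score = similarity(pi["pi_name"], pub["author"])
        if score > 0.6:
            print(f'{pi["award_id"]}: {pi["pi_name"]} <-> {pub["doc_id"]} (score {score:.2f})')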
What does this entail?
Partner with PIs to
– develop flow-based annual and final reports/biosketches
  • http://ideas.repec.org/e/pla36.html
  • http://citeseerx.ist.psu.edu/
– build visualizations of networks and impact (see the sketch after this list)
– enable collaborative tagging of research outputs, etc.
Partner with university administrators to develop flow-based impact of science funding
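As one hedged example of what "visualizations of networks and impact" might look like in practice, the sketch below builds a small co-authorship graph from (paper, authors) records and draws it with networkx and matplotlib. The data, the spring layout and the degree-based node sizing are illustrative choices, not part of the SciSIP or STAR METRICS tooling.

# Illustrative co-authorship network; data and styling choices are hypothetical.
from itertools import combinations
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical (paper -> authors) records, e.g. assembled from a PI's flow-based report.
papers = {
    "paper-1": ["Doe", "Smith", "Lee"],
    "paper-2": ["Doe", "Garcia"],
    "paper-3": ["Smith", "Lee"],
}

# Link every pair of co-authors; the edge weight counts shared papers.
graph = nx.Graph()
for authors in papers.values():
    for a, b in combinations(sorted(set(authors)), 2):
        weight = graph[a][b]["weight"] + 1 if graph.has_edge(a, b) else 1
        graph.add_edge(a, b, weight=weight)

# Draw the network; node size scales with degree as a crude visual impact proxy.
pos = nx.spring_layout(graph, seed=42)
sizes = [300 * graph.degree(n) for n in graph.nodes]
nx.draw_networkx(graph, pos, node_size=sizes, with_labels=True)
plt.axis("off")
plt.savefig("coauthor_network.png")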
Ultimate Goals
Fully fledged academic field
Fully fledged analytical tool set in government: science policy in the same analytical tier as tax policy
Common empirical infrastructure available to all universities and science agencies to quickly respond to State, Congressional and OMB requests
Common scientific infrastructure for researchers to develop and study science metrics
Join the Effort
Join the listserv: send a blank email to [email protected]
Visit the new site at the end of November and:
– Register and build your profile
– Add material to the site (publications, news, events, etc.)
– Comment on and rate existing material
Submit proposals to SciSIP
http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=501084&org=SBE