Page 1: A  Bibliometric  Comparison of the Research of Three UK Business Schools

A Bibliometric Comparison of the Research of Three UK Business Schools

John Mingers, Kent Business School

[email protected]

March 2014

Page 2: A  Bibliometric  Comparison of the Research of Three UK Business Schools

1. Overview

• Introduction
• The Leiden Methodology
• Criticisms of the Leiden Methodology
• The Empirical Study – Three UK Business Schools
  • Overview of Publications and Citations
  • Differences between Fields
  • Results of the Methodology

• Conclusions

Page 3: A  Bibliometric  Comparison of the Research of Three UK Business Schools

2. Introduction

• There has been an ever-increasing desire to monitor and measure the quality of a department’s research

• Apart from peer review, the main method for doing this is bibliometrics, generally based on measuring the citations received by published papers (citations per paper – CPP)

• However, there are major differences between the citation patterns of different fields, and these must be taken into account in any comparison

• This means that the data must be normalised with respect to field and time

• The Leiden ranking methodology (LRM) is one of the most well developed methodologies for comparing departments.

• It has been used mainly in the sciences, where Web of Science (WoS) coverage is better. This study is one of the first tests with social science departments.

Page 4: A  Bibliometric  Comparison of the Research of Three UK Business Schools

3. The Leiden Methodology (crown indicator)

1. Collect all papers from a department over a specific time window, e.g. 5 years
2. Use the Web of Science to find the citations for each paper (assuming it is included in WoS)
3. Calculate how many citations such a paper would be expected to receive given its field and year of publication
   a. WoS has field lists of all the journals in a particular field. From this we can find the total citations to papers published in that field in that year and divide by the number of papers, giving the field citations per paper (FCS)
   b. We can calculate a similar figure just for the set of journals the department actually publishes in (JCS)
4. Total the actual number of citations for all papers and the expected number of citations for all papers. Dividing gives the crown indicator, i.e. the average citations per paper relative to the field average (CPP/FCS) – see the sketch after this list
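Once each paper's actual and field-expected citation counts are known, the steps above reduce to a simple ratio. The Python sketch below assumes the WoS matching and the field/year lookups (step 3) have already been done; the function name and data structure are illustrative, not from the study, and the sample expected values reuse the 2005-2008 field averages from Table 4. It also computes the per-paper variant (MNCS) discussed on the next slide.

```python
# Minimal sketch of the crown-indicator calculation (assumed data structure:
# one dict per paper, holding its actual citation count and its field/year
# expected citations, i.e. the FCS value for that paper's field and year).

def leiden_indicators(papers):
    total_cites = sum(p["cites"] for p in papers)
    total_expected = sum(p["expected"] for p in papers)
    # Crown indicator: sum citations first, then divide by summed expectations
    cpp_fcs = total_cites / total_expected
    # MNCS: normalise each paper first, then average the ratios
    mncs = sum(p["cites"] / p["expected"] for p in papers) / len(papers)
    return cpp_fcs, mncs

# Illustrative departmental set; the expected values are the 2005-2008 field
# averages from Table 4 (Business 1.30, Economics 1.06, Management 1.70).
dept = [
    {"cites": 12, "expected": 1.30},
    {"cites": 3,  "expected": 1.06},
    {"cites": 0,  "expected": 1.70},
]
print(leiden_indicators(dept))  # approximately (3.69, 4.02): well above the field average
```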

Page 5: A  Bibliometric  Comparison of the Research of Three UK Business Schools

5. The value may be:
   • > 1 – the department is doing better than the field
   • = 1 – it is average
   • < 1 – it is below average
   • Very good departments may be 2 or 3 times the average

6. We can also calculate JCS/FCS, which shows whether the department is targeting better or worse journals than the field as a whole

7. This methodology has been criticised for its method of calculation – should you sum the citations before dividing, or do the division for each paper and then average?
   • The first method can cause biases, e.g. in favour of publications in fields with high citation numbers, and it is now accepted that the second method is correct (see the formulas below)
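Written out explicitly (with c_i the actual and e_i the field-expected citations of paper i – notation introduced here rather than on the slide), the two calculation orders are:

\[
\frac{\mathrm{CPP}}{\mathrm{FCS}} \;=\; \frac{\sum_{i=1}^{n} c_i}{\sum_{i=1}^{n} e_i}
\qquad\text{versus}\qquad
\mathrm{MNCS} \;=\; \frac{1}{n}\sum_{i=1}^{n} \frac{c_i}{e_i}
\]

The first weights papers in highly cited fields more heavily, which is the bias mentioned above; the second gives every paper equal weight.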

Page 6: A  Bibliometric  Comparison of the Research of Three UK Business Schools

4. Criticisms of the LRM

1. The use of WoS for collecting citations
   • WoS does not include books, either as targets or for their citations
   • Until recently it has not included conference papers
   • It does not include many journals
   • The coverage is differential across disciplines – science is good (> 90%), social science is mediocre (30%-70%), arts and humanities is poor (20%)
2. The use of field lists from WoS, which are ad hoc and not transparent
3. Bibliometric methods should not be used by themselves but only in combination with peer review and judgement
4. Are citations a good measure of quality, or just of impact?

Page 7: A  Bibliometric  Comparison of the Research of Three UK Business Schools

5. The Study – Three UK Business Schools

1. Three business schools were selected, primarily because of the availability of the data, but they were reasonably representative:
   A. New school but in a very prestigious university
   B. New school in a traditional university undergoing rapid expansion
   C. Older school in a modern university aiming to become more research intensive

TABLE 1
RESEARCH OUTPUTS FOR THREE SCHOOLS

                                    School A    School B    School C
Years covered by outputs            1981-2008   1984-2009   1980-2008
Staff in the 2008 UK RAE            45          39          39
Authors involved in outputs         816         675         461
Total outputs                       1933        1455        1212
Total journal papers                705         629         548
Total outputs in Web of Science     403         309         292

Page 8: A  Bibliometric  Comparison of the Research of Three UK Business Schools

TABLE 3
A COMPARISON OF RESEARCH OUTPUTS ACROSS SCHOOLS
(each cell: number of publications / number of journals)

ISI WoS subject area                        School A    School B    School C
Agriculture*                                -           -           30 / 10
Business                                    76 / 34     34 / 18     32 / 19
Business Finance                            14 / 5      11 / 4      2 / 2
Computer Science*                           17 / 10     3 / 3       20 / 14
Economics                                   60 / 33     37 / 21     59 / 23
Engineering                                 14 / 10     17 / 7      20 / 10
Environmental Sciences                      25 / 9      2 / 2       11 / 6
Environmental Studies                       -           6 / 3       12 / 7
Ethics                                      10 / 2      -           1 / 1
Food Science Technology                     -           2 / 2       15 / 4
Geography                                   10 / 4      7 / 5       4 / 4
Health Care Sciences & Services             3 / 2       32 / 6      -
Management                                  135 / 46    79 / 29     78 / 24
Mathematics Applied                         12 / 7      6 / 2       30 / 13
Operations Research & Management Science    20 / 10     6 / 3       60 / 14
Pharmacology Pharmacy                       1 / 1       10 / 2      -
Planning Development                        14 / 7      10 / 5      3 / 4
Political Science                           -           15 / 5      2 / 2
Public Administration                       4 / 4       11 / 5      9 / 4
Social Sciences*                            21 / 7      14 / 10     10 / 6
Others (<10 publications per field)         30 / 54     38 / 90     24 / 44
Journal papers not in WoS                   302         320         256

Page 9: A  Bibliometric  Comparison of the Research of Three UK Business Schools

TABLE 4
EXPECTED CITATIONS PER PAPER BY FIELD AND PERIOD

Period       Business    Economics    Management
2001-2004    0.73        0.83         1.05
2002-2005    0.84        0.88         1.16
2003-2006    0.99        0.92         1.29
2004-2007    1.06        0.95         1.38
2005-2008    1.30        1.06         1.70
% change     78%         28%          60%
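Reading the table (the citation figure below is hypothetical, chosen only to illustrate the normalisation): a Management paper in the 2005-2008 window is expected to attract 1.70 citations, so a paper that actually received 3.4 citations would be normalised to 3.4 / 1.70 = 2.0, i.e. twice the field expectation. The % change row also shows why normalisation over time matters: the same raw citation count is worth much less relative to expectation in 2005-2008 than in 2001-2004, especially in Business.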

Page 10: A  Bibliometric  Comparison of the Research of Three UK Business Schools

TABLE 5
LEIDEN INDICATOR FOR THREE UK SCHOOLS FOR SEVERAL TIME PERIODS
(P = papers, C = citations, CPP = citations per paper, TJP = total papers of journals, TJC = total cites of journals)

Period      School   P     C     CPP    TJP      TJC      JCSm   CPP/JCSm   CPP/FCSm   MNCS   JCSm/FCSm
2001-2004   A        108   193   1.79   17,324   15,682   0.91   1.97       1.95       2.03   0.99
            B        60    94    1.57   11,359   9,532    0.84   1.87       1.69       1.70   0.91
            C        56    55    0.98   11,155   8,975    0.80   1.22       1.07       1.03   0.87
2002-2005   A        124   242   1.95   17,528   18,907   1.08   1.81       1.91       1.90   1.05
            B        50    53    1.06   10,454   10,154   0.97   1.09       1.03       1.07   0.95
            C        70    51    0.73   13,008   10,954   0.84   0.87       0.72       0.73   0.83
2003-2006   A        121   235   1.94   20,712   24,585   1.19   1.64       1.74       1.67   1.06
            B        61    80    1.31   11,809   12,224   1.04   1.27       1.15       1.20   0.91
            C        75    101   1.35   13,970   13,661   0.98   1.38       1.22       1.24   0.88
2004-2007   A        143   299   2.09   24,110   32,521   1.35   1.55       1.77       1.73   1.14
            B        65    102   1.57   12,661   14,724   1.16   1.35       1.31       1.34   0.97
            C        94    150   1.60   16,389   16,785   1.02   1.56       1.34       1.40   0.96
2005-2008   A        118   346   2.93   23,642   41,520   1.76   1.67       2.05       2.01   1.23
            B        60    162   2.70   10,111   12,954   1.28   2.11       1.87       1.97   0.89
            C        79    190   2.41   16,114   20,943   1.30   1.85       1.70       1.71   0.92
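As a consistency check on how the columns relate (the derivations are inferred from the column names, not stated on the slide): for School A in 2001-2004, CPP = C / P = 193 / 108 ≈ 1.79; JCSm = TJC / TJP = 15,682 / 17,324 ≈ 0.91; CPP/JCSm ≈ 1.79 / 0.91 ≈ 1.97; and since JCSm/FCSm = 0.99, FCSm ≈ 0.91, giving CPP/FCSm ≈ 1.95, as tabulated.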

Page 11: A  Bibliometric  Comparison of the Research of Three UK Business Schools

6. Main Results

1. Raw (un-normalised) CPP: A is always the highest and is rising. B and C alternate but are also rising, especially in 2005-08 (just before the RAE)

2. JCS/FCS (quality of the journal set): generally around or below 1, showing the journal set is average for the field, but it rises for A (0.99 to 1.23), showing that A is improving the quality of its journal set.

3. CPP/FCS (the crown indicator): all schools are above 1.0, so better than the field average, but not hugely so (very good departments reach 2 or 3 times the average). A actually fell before rising. Could this be because it was targeting better journals and so competing against a stronger field?

4. MNCS (the alternative calculation): only marginal differences

Page 12: A  Bibliometric  Comparison of the Research of Three UK Business Schools

5. Do we gain anything by the normalisation?
   • A has improved the quality of its journal set
   • A appeared to improve over all years, but in fact fell back, so this was really a field effect
   • It would allow us to compare schools with very different subject mixes, or departments from different subjects

6. Comparison with 2008 RAE results (GPA out of 4.0):
   • A – 3.05
   • B – 2.45
   • C – 2.50

   So there is good agreement with the bibliometric results.

Page 13: A  Bibliometric  Comparison of the Research of Three UK Business Schools

7. Conclusions

1. The Leiden methodology can be applied in the social sciences, to business schools, but it requires a huge amount of data collection and processing. It would have to be automated.

2. The results did provide a limited amount of extra value over raw scores.

3. But there are major problems in using WoS:
   • Only 20% of the outputs of the schools could actually be evaluated
   • The WoS field categories are poorly defined

4. At the moment the LRM is NOT suitable for assessing business schools' research performance. Perhaps Google Scholar could be used, although it is unreliable and has no field categories.

