The publication of the interim report (here) on the 2012 evaluation of research excellence in tertiary education organisations (TEOs) has caused a ruckus in academia.
This is the third Quality Evaluation and the second full round since 2003 in an exercise that will affect funding for the TEOs from the Performance Based Research Fund.
The NZ Herald (here) noted that the University of Auckland had been knocked off its perch by Wellington’s Victoria University in the resultant ranking of universities.
Vice-chancellor Professor Stuart McCutcheon said the second-place ranking for his institution was “apparently counter-intuitive”. But he took comfort that the PBRF had confirmed the university as the country’s leading research institution.
It had secured the largest share of the fund, with $80.4 million or 30.6 per cent of the national total, reflecting its dominance in the three core components of the PBRF.
McCutcheon said the change in rankings reflected a new calculation method. For example, the postgraduate component divided the quality parameter by the number of postgraduate students enrolled, which disadvantaged Auckland because it had grown its programme.
AUT vice-chancellor Derek McCormack told the NZ Herald the rankings were “spurious” because they were based on average scores.
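McCormack's objection can be made concrete with a toy calculation. The sketch below uses entirely hypothetical numbers (not actual PBRF data) to show how a ranking based on average quality scores can place a smaller institution above a larger one that produces more research in total:

```python
# Illustrative sketch with hypothetical figures: an average-based
# ranking rewards a high score per assessed staff member, not total
# research output.

def average_quality(total_quality_points: float, fte_staff: float) -> float:
    """Average quality score per full-time-equivalent staff member."""
    return total_quality_points / fte_staff

# University A: large, with the greater total output.
a_total, a_staff = 5000, 1000
# University B: smaller, with fewer assessed staff but a higher average.
b_total, b_staff = 2400, 400

a_avg = average_quality(a_total, a_staff)  # 5.0
b_avg = average_quality(b_total, b_staff)  # 6.0

# B outranks A on the average measure despite A's larger total output.
print(a_avg, b_avg)
```

The same arithmetic shows why excluding or reclassifying lower-scoring staff lifts an institution's average: shrinking the denominator faster than the numerator raises the score without any new research being done.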
At Kiwiblog, David Farrar headed his post Quality or quality rorting? (here).
It is pretty clear from these numbers that VUW’s ascendancy to the top of the table is more to do with their disciplined program to reclassify staff, than any actual increase in quality.
The Government has already made some rule changes in response to VUW’s rorting and rigging attempts. I think they have a bit more work to do.
As the PBRF is heavily weighted towards publications, it can disadvantage those who, for example, work in commercially sensitive areas (e.g. medical and drug research) where publication is not always allowed. And for any researcher, grants sometimes fail to eventuate, experiments fail, or one's personal life can throw some heavy-duty curve balls that inhibit research. PBRF scores are not the be-all and end-all of a researcher's worth or of his or her future capability.
Also at Sciblogs, John Pickering (here) has highlighted the way the university spin doctors dealt with the findings by showing the media statements they released on 12 April.
Pickering describes the evaluation process and argues that if the TEOs don't score well in any of the four overall grades (compared with other institutions of their own size), they can pick a favourable number from one of their academic units and talk about that.
He says –
Academics are notoriously competitive – obviously a good trait when it drives them to produce better and better research. I certainly have some of that streak in me. However, it is not helpful when it results in attempting to pull the wool over the eyes of the public as happened yesterday.
The PBRF is a complex system designed to find a way to allocate research funds and hopefully improve research quality. Academics will argue until the cows come home if it does this fairly. It certainly is a very expensive exercise. It certainly focusses institutions on the importance of research, which is a good thing.
Remember, the teaching in our universities (not polytechnics) is required by law to derive from research. However, as a small country where the combined size of all our universities is only that of some of the larger overseas universities, I wonder if such an inter-institution competitive model is the best for the country?
Perhaps the story should be an evaluation of the costs and benefits of the exercise. Is this the best method of allocating funds? Such a story should also consider whether the competition is enhancing or detracting from the quality of research – after all, in almost any field experts are spread across institutions.
Collegiality is a major driver of good research – does PBRF hinder that?
In a foreword to the report, Steven Joyce, Minister for Tertiary Education, Skills and Employment, says –
Overall, science and innovation funding across Government will grow to more than $1.3 billion a year by 2015/16. Part of that funding is a commitment to increasing the size of the Performance-Based Research Fund (PBRF) by 20 per cent, from $250 million a year to $300 million a year, by 2016.
The results contained in this report suggest this increase in investment is warranted. In the last 10 years, the number of research staff whose evidence portfolios received a funded Quality Category has increased from approximately 4,450 full-time equivalent staff to over 6,300 full-time equivalent staff.
The PBRF has played an integral part in this significant shift.
The report says the PBRF is intended to increase the average quality of research, ensure that research continues to support degree and postgraduate teaching, ensure that funding is available for postgraduate students and new researchers, and to improve the quality of information on research outputs.
The amount of PBRF funding that a participating tertiary education organisation receives is based on its performance in the three elements of the PBRF: the Quality Evaluation, research degree completions and external research income.
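A composite measure built from those three elements might combine them as a weighted sum. The weights and institutional shares below are illustrative assumptions for the sake of the sketch, not the official PBRF weightings:

```python
# Hypothetical composite of the three PBRF elements. The weights are
# assumptions chosen for illustration only; the report does not state
# them here.

WEIGHTS = {
    "quality_evaluation": 0.60,
    "research_degree_completions": 0.25,
    "external_research_income": 0.15,
}

def composite_share(shares: dict) -> float:
    """Weighted sum of an institution's share (0-1) in each element."""
    return sum(WEIGHTS[k] * shares[k] for k in WEIGHTS)

# Hypothetical institution: strong on external income, average elsewhere.
share = composite_share({
    "quality_evaluation": 0.30,
    "research_degree_completions": 0.25,
    "external_research_income": 0.35,
})
# 0.60*0.30 + 0.25*0.25 + 0.15*0.35 = 0.295
print(share)
```

The point of the sketch is that an institution's overall funding share depends on all three elements at once, so dominance in one (as Auckland claims in the Quality Evaluation) need not translate into the top ranking on any single derived measure.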
Twenty-seven TEOs participated in the evaluation, compared to 33 in 2006. Participating TEOs in 2012 included all eight NZ universities; 10 institutes of technology and polytechnics; one wānanga; and eight private training establishments.
Any thoughts from NZIAHS members?