Editorial: Research Review 2013
Professor Brett Delahunt, Editor
The latest round of Performance-Based Research Funding (PBRF) for universities was finalised in late 2012, with the results released in early 2013. The PBRF programme assesses universities on a range of research achievements, quantified as research degree completions and external research funding income. In addition, individual staff are assessed on their research performance, the specific categories being the number and quality of publications, contributions to the research environment, and peer esteem. Although the scores of individual staff members remain confidential to the participant, grouped scores are released to the universities and are used to determine an average quality score for each tertiary institution.
For the record, in the 2012 round the University of Auckland ranked first for both research degree completions and external research income, while the staff of Victoria University of Wellington achieved the highest Average Quality Score. Although the 2012 PBRF round produced winners and losers, many participating universities were able to put a particular spin on the results to indicate that they were well placed in the academic environment, and in general the results of the round were well received.
PBRF rounds are conducted on a six-year cycle and are an enormous undertaking, both for participating tertiary institutions and for the Tertiary Education Commission (TEC). For the 2012 round, 6,757 portfolios were submitted and were examined by 12 panels comprising a total of 309 assessors. Considerable effort is required to compile a portfolio, and the quality of the finished product can affect individual scores. Such is the importance of the portfolio's content that the TEC noted that, in 2012, some researchers failed to provide sufficient evidence in their portfolios to be awarded an A grade, which the quality of their publications would otherwise have earned. To address this, universities have provided coaching in portfolio preparation and have undertaken internal assessments of draft portfolios well before the formal PBRF round.
While trial portfolios improve skills in portfolio writing, for many researchers they are of limited value. An individual PBRF portfolio contains a full list of publications produced during the preceding six-year interval. The staff member is also required to select four nominated articles and write a commentary on each, emphasising its importance. For many, the nominated research outputs chosen for trial assessment are not necessarily those finally submitted, as more important works may be produced closer to the cut-off date for portfolio submission. Valuable time is therefore consumed in trial portfolio production, and it could be argued that this time would be better spent on research. The assessment of portfolios also consumes potential research time, as assessors are required to evaluate each portfolio and assign a score to each output category. This is of particular importance because members of assessment panels are senior academics, usually with heavy research commitments.
In the report on the 2012 PBRF round it was noted that, in comparison with the 2003 round, the proportion of A grades (the highest category) increased from 9.5% to 13.2% of submitted portfolios, the proportion of B grades increased from 38.6% to 40.1%, and the proportion of C grades decreased from 51.9% to 32.0%. This implies that research quality increased over this period.
Is this apparent improvement in quality reflected in the overall performance of the tertiary sector? Recent reports suggest that it is not. The QS World University Rankings, based on criteria that include academic reputation, employer reputation, faculty-student ratio, citations per faculty, international students and international faculty, were released in September 2013. In virtually all cases the rankings of New Zealand universities declined from the previous year. The University of Auckland again topped the national rankings, but dropped from 83 in 2012 to 94 in the current round. The other three leading New Zealand universities showed similar declines: Otago fell from 133 to 155, Canterbury from 231 to 238, and Victoria University from 237 to 265 (equal). While changes in ranking are relative and could indicate that the rate of improvement among New Zealand universities cannot match that of some overseas institutions, the stark reality is that we are falling behind our international counterparts.

There are several possible reasons for this; the most likely, however, is a significant decrease in tertiary funding in this country. As has been argued in previous Research Review editorials, the decline in research funding in New Zealand is palpably obvious, and demand for financial support from non-governmental sources remains high. This demand is reflected in the increasing number of funding applications received by the Foundation, which frequently exceeds available resources. The Foundation is committed to the promotion of research locally, and while it provides significant research assistance, funding must often be prioritised. It is clear that an infusion of funds is an ongoing requirement for the Foundation, and the current state of research funding is such that this requirement will only increase if we are to continue to provide meaningful assistance to researchers within the Wellington Region.
Professor Brett Delahunt, Editor
Research Advisory Committee Membership
Professor Brett Delahunt (Chair)
Dr David Ackerley
Dr Peter Bethwaite
Associate Professor Duncan C Galletly
Dr Rebecca Grainger
Dr T William Jordan
Associate Professor Ann La Flamme
Professor Graham Le Gros
Professor John H Miller
Dr Kyle Perrin