Lest I be accused of publishing only good news on these pages, this week I can report that NU has dropped significantly following publication of the 2013 THE World University Rankings. To be precise, we have fallen from 180th position in 2012 to 198th in 2013.
The quantitative scientists and engineers amongst us will immediately be aware that such a fall may not be statistically significant – i.e. it is ‘in the noise’ – but that is unfortunately not the point. The general populace, in these days of soundbites and social media, seems unconcerned by such details, and indeed governments around the world are increasingly using league tables (including THE’s) to determine where they will (and will not) send their students. A drop to 198th place is a really big deal for us, since the cut-off is typically 200th place.
Inspection of our score shows where we have lost ground. Our total score is determined by performance in five areas. These are as follows for 2013, with the 2012 score in brackets: teaching 29.7 (37.9); international outlook 76.3 (74.4); industrial income 36.9 (37.4); research 28.3 (30.5); citations 68.1 (72.1); overall score 44.5 (48.6). It is clear from these data that we have performed worse in every area except international outlook since last year. So what can we do about it? Cynically, we could devise a strategy aimed at improving our scores in each of these areas by addressing them directly. In my view this would not be a sensible approach, for the simple reason that the methodology is likely to change in the future. Several universities have endeavoured to build a vision on the concept of being in the top n of the world league tables, only to find that they have in fact dropped when the methodology changed. Does this make them lesser universities overnight? Of course not.
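For the quantitatively minded, the year-on-year movements above can be tallied in a few lines of Python. This is just a throwaway sketch using the scores quoted in this post; the dictionary and its labels are my own arrangement, not anything from the THE methodology itself:

```python
# NU's THE scores as quoted above: (2012 score, 2013 score) per area.
scores = {
    "teaching": (37.9, 29.7),
    "international outlook": (74.4, 76.3),
    "industrial income": (37.4, 36.9),
    "research": (30.5, 28.3),
    "citations": (72.1, 68.1),
    "overall": (48.6, 44.5),
}

# Print the change in each area between the two years.
for area, (y2012, y2013) in scores.items():
    delta = y2013 - y2012
    direction = "up" if delta > 0 else "down"
    print(f"{area}: {y2013} ({direction} {abs(delta):.1f} on {y2012})")
```

Running it confirms the point made above: every area is down on 2012 except international outlook.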
A much more sensible approach, in my opinion, is to focus on excellence in all we do and let the league tables take care of themselves. In fact the lion’s share of the score in the THE methodology derives from research-related performance, and as I have boringly repeated on this blog in the past, our research performance is somewhat off the pace of many of our peers. As we approach the REF 2014 census date, dare I say that we need to maintain our momentum, with the aim of securing an exceptional result. To that end, Faculty Executive Board will soon discuss the ways in which we can accelerate our research performance, with a particular emphasis on the barriers to doing so in individual schools.
Here’s a final thought. Another aspect of the THE methodology is a ‘reputation survey’. A random group of peers listed in the Web of Science is asked to rank the top 15 institutions in a particular discipline, and these opinions contribute 33% of the score. Last year 16,000 academics responded to the survey. Why not try this for yourself – think of the top 15 universities for your discipline, then think about why you chose them. When I try this for my discipline (loosely, chemistry) I conclude that it is ‘impression marking’, based more on historic reputation than on any tangible metric. I hope I’m in the minority.
How can we get NU into the top 15 of our peers’ rankings? Answers on a postcard please.