New tables will lift the lid on university life

For years, universities have successfully sidestepped government attempts to establish performance indicators for what they do, but all that is about to change.

Lucy Hodges
Wednesday 28 July 1999 18:02 EDT

Until now, naming and shaming has been done to schools and hospitals rather than universities. That's expected to change this autumn, with the publication of performance indicators for higher education. New data about to spew forth from the Higher Education Funding Council for England (Hefce) will let newspapers compile "official" league tables comparing universities on their drop-out rates, or on how good they are at attracting students from deprived backgrounds and from state schools.

Customers - parents and their offspring - should benefit from this new welter of information. Michael Sterling, Vice-chancellor of Brunel University and chairman of the Higher Education Statistics Agency (Hesa), believes it'll be very useful to young people applying to university. "It is a major step forward," he says. Indicators will show, for example, how efficient named universities are at getting students through degree courses in the allotted timespan.

Higher education is awaiting the new dawn with some trepidation. But it regards the publication of official statistics as at least preferable to newspapers' own league tables, which have ranked institutions in ways academics consider misleading. "If we're going to have a league table, it's better to have an official one than an unofficial one," says Geoffrey Alderman, head of academic development and quality assurance at Middlesex University. "The problem with The Times league table is that it is geared to a particular type of traditional university. At Middlesex, the majority of our students do not have A-levels, so to get away from that sort of inherent bias has to be a good thing."

Michael Sterling is also pleased, as he regards current newspaper league tables as fairly meaningless. They contain "silly" things such as student-staff ratios, he says, which are unhelpful as currently calculated because newspapers do not distinguish between staff employed on the strength of research reputation and those employed to teach students. "I am much more comfortable about league tables being drawn up from the Hefce indicators, than I am about the league tables that exist at the moment," he says. "There are enormous flaws in the way they're put together at present. Any new tables won't be perfect. I am sure there will be snags, but they will be better than they are now."

To try to get round the complaint that it's difficult to compare higher education institutions because they're so different (former polytechnics have traditionally been less well funded than "old" universities that have established formidable research reputations), Hefce is producing "adjusted sector benchmarks". These allow for differing subject mix and entry qualifications. Thus, for example, each university will receive a raw score for its drop-out rate, as well as a figure showing how it compares with other similar institutions.
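Hefce's exact formula is not given here, but the logic of an adjusted benchmark can be sketched: take sector-wide rates for each category of student, then re-weight them by the institution's own mix before comparing. The Python below is a minimal illustration under that assumption; all category names and rates are invented, not Hefce's.

```python
# Illustrative sketch only: a simple indirect standardisation by subject
# and entry route. All category names and rates below are invented.

SECTOR_RATES = {  # hypothetical sector-wide completion rates per category
    ("medicine", "a_level"): 0.95,
    ("business", "a_level"): 0.88,
    ("business", "vocational"): 0.78,
}

def adjusted_benchmark(student_mix):
    """Weight sector rates by this institution's own student mix.

    student_mix maps (subject, entry_qualification) -> student numbers.
    """
    total = sum(student_mix.values())
    return sum(SECTOR_RATES[cat] * n for cat, n in student_mix.items()) / total

# A university heavy in vocational-entry students gets a lower benchmark,
# so its raw completion rate is judged against similar institutions.
mix = {("business", "vocational"): 700, ("business", "a_level"): 300}
raw_rate = 0.80
bench = adjusted_benchmark(mix)
print(f"raw {raw_rate:.0%} vs benchmark {bench:.0%}: "
      f"{'above' if raw_rate > bench else 'below'} expectation")
```

On these made-up figures, a raw completion rate of 80 per cent sits just below a benchmark of 81 per cent, where an unadjusted comparison with the whole sector would have made the same university look far worse.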

"What we're saying is, `You should not really be comparing a lot of these institutions because they have such different mission statements and are taking in such different sorts of people,'" says an Hefce insider. The big question will be the extent to which newspapers will bother with the benchmarks or simply rank universities by raw scores.

Certainly not all higher education experts are happy with the idea of benchmarking, or putting institutions into context, as they think it will allow universities such as Oxford and Cambridge to look better on some indicators than they otherwise might. "I don't think we want to see particular institutions getting let off the hook as far as certain indicators are concerned," says Patricia Ambrose, chief executive of the Standing Conference of Principals, which represents higher education colleges. "We don't want to see an elite 'old' university getting away with having done little about widening participation."

One of the snags - for newspapers certainly - will be the volume of information published. Performance indicators have been drawn up in five areas, and in some cases there are different indicators for traditional-age students and for mature students. For example, there are three indicators for young students in the area of how successful universities are at recruiting students from disadvantaged homes: one gives the parents' social class, another the proportion from state schools, and the third the percentage of students from neighbourhoods from which few people go to university.

In the area covering drop-out rates, there are two indicators, both applying only to full-time degrees: one for young students, the other for mature students. They measure whether students stick it out in higher education or chuck in their studies. A separate indicator examines each institution's record at turning out students with qualifications.

The indicators look at the efficiency of learning and teaching - how good universities are at getting students through their courses in the allotted period of three or four years. Teaching Quality Assessments (TQAs) are being omitted as they're widely available on the Internet anyway, and also because Hefce wants a set of indicators it can repeat year-on-year. The TQAs are not done for every institution every year.

The Research Assessment Exercise (RAE) is not being included in the indicators to be published this year either, partly because Hefce wanted year-on-year comparison, which the RAE does not provide. Like the Quality Assurance Agency's figures, the RAE is widely available, so it doesn't need covering again. Instead, the research output indicator examines the number of PhDs awarded, compared with inputs such as academic staff costs and research funding.
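The article names the inputs but not a precise formula, so the ratio below is only a plausible reading, with invented figures:

```python
# Hypothetical sketch: the inputs (staff costs, research funding) come from
# the article, but the exact formula and all figures here are invented.

def phd_output_ratio(phds_awarded, staff_costs_m, research_funding_m):
    """PhDs awarded per million pounds of research-related input."""
    return phds_awarded / (staff_costs_m + research_funding_m)

# A small unit producing PhDs cheaply can outscore a big research name.
print(phd_output_ratio(phds_awarded=40, staff_costs_m=3.0, research_funding_m=1.0))    # 10.0
print(phd_output_ratio(phds_awarded=120, staff_costs_m=20.0, research_funding_m=10.0)) # 4.0
```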

Universities scoring high on that indicator will not necessarily be the same as those scoring high on the RAE. An institution where only a small department produces PhDs, but does so very efficiently, could score well on the new indicator yet poorly on the RAE. Similarly, one with modest research funding but a comparatively large output of PhDs would do quite well, even though that success might not show up on the RAE.

Two areas in which performance indicators are to be developed are on hold for now while more work is done on the methodology of collecting statistics. One is the hot political potato of graduate employment; the other is university links with industry.

Current graduate employment figures, based on statistics collected by universities, are supposed to show the proportion of graduates employed six months after graduation. Everyone, however, knows that these figures are dodgy. It all depends on how the university collects them, how much effort it puts into the exercise, and whether it counts students who have not replied, and those who are in temporary work flipping burgers in McDonald's. Some universities claim graduate employment figures of almost 100 per cent - but no one believes them.
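The sensitivity to non-respondents is easy to show with invented numbers: the same survey returns support two very different headline rates depending on how the students who never replied are treated.

```python
# Invented figures illustrating the counting problem described above.
graduates = 1000
employed = 600
unemployed = 100
no_reply = 300

# Exclude non-respondents from the denominator...
rate_optimistic = employed / (employed + unemployed)   # ~85.7%
# ...or count them as not employed.
rate_pessimistic = employed / graduates                # 60.0%

print(f"{rate_optimistic:.1%} vs {rate_pessimistic:.1%}")
```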

The idea that universities would be rated on such unreliable figures - and that funding might come to be tied to them - caused a furore. That is why the employment indicator has been dropped for now. But the Government wants to see one produced next year, because it wants to concentrate universities' minds on graduate employability.

Until now, league tables have been compiled by a couple of newspapers out of data from the Research Assessment Exercise, which rates universities for research; the Teaching Quality Assessments, which rate them for teaching; and from the Higher Education Statistics Agency, which pours out a multitude of raw statistics. The motivation behind the league tables has been to provide useful information to newspaper readers to help them in selecting universities.

For years universities have successfully sidestepped government attempts to establish performance indicators because they were fearful of being labelled winners or losers. Instead they established "management statistics" which avoided ranking universities. Now, however, they can hold out no longer.

The reason why the Department for Education and Employment and the Treasury want to see performance indicators is to find out whether taxpayers' money is being well spent and to prod universities into becoming more efficient. Government wants to be able to compare institutions, and the DfEE, in particular, wants to put pressure on universities to implement its cherished policies. The consumers' interests are fairly incidental in this except insofar as consumers are also taxpayers.

Hefce believes its indicators will be much more useful for year-on-year comparisons of the same institution than for comparisons between institutions. On this view, they will be useful to managers of universities and to those in government who want a helicopter view of the system to ensure policy is being carried out - and of less use to consumers. But consumers are bound to use them anyway.

Newspapers are likely to publish league tables on the grounds that this is official data ranking universities on a variety of criteria. Mike Milne-Picken, planning officer at the University of Central Lancashire, is afraid newspapers will pick and choose the criteria on which they base league tables. For example, they might choose to highlight drop-out rates rather than the extent to which universities are recruiting students from deprived backgrounds. That would favour elite universities taking students with high A-level scores over new universities that take many without A-levels.

"Our figures show that we have much higher recruitment of students from disadvantaged groups than the sector norm, and not such great progression rates compared with the universities that don't have such high recruitment of those from deprived backgrounds," he says.

e-mail: lucy@scribbl.demon.co.uk
