How to come top in the Guardian subject league tables
The Guardian subject league tables have always been a bit of a mystery. I admit, I only look at my particular subject area. And when I say ‘look’, it’s more of a glance because someone has mentioned it in the corridor, or at a meeting. The mysterious methodology used to herald a particular institution as top in a subject means that the overall ranking of institutions is a stacked cluster-suck of not-very-complicated maths combined with unfathomable decisions, which ultimately leads to a reaffirmation of the establishment.
Here’s the league table for my subject, Materials Science:
You’ll notice the top institution, and you should congratulate them, for they are the best at, erm, let me find the definition of what these league tables are supposed to reflect…oh yes, here it is. Well done Oxford for being the best at teaching materials. Here’s the explanation from the Guardian: ‘Eight statistical measures are employed to approximate a university’s performance in teaching each subject’.
No one lecturing within a subject really believes any of these league tables (right?), of which there are many. But what if we look more closely at the individual scores? The first three measures are of satisfaction, based on responses from the actual students studying at each institution.
Reordering the table, firstly on the proportion of current students who are satisfied with the course: Well done Leeds! You’ll notice the number one institution for ‘performance in teaching’ is bottom-but-one.
And now reordering based on the proportion of current students satisfied with the teaching on their degree: Congratulations Swansea. My own institution. But I’m not going to shout about it, because, yes, there are huge problems with any league table or metric. Also, you’ll notice the number one institution for ‘performance in teaching’ is dead bottom.
Let’s reorder based on the proportion of current students satisfied with the feedback they’ve received during the degree: Pat yourselves on the back, Sheffield Hallam. Perhaps you’ve also noticed the number one institution for ‘performance in teaching’ does a little better here, only coming fourth from bottom.
Those are the only measures in this entire league table that are based on responses from actual students studying materials at the listed institutions. This is also a good time to point out that there are biases and problems with surveying students and getting too carried away with the results and rankings. This isn’t an attack on Oxford, as I know their degree is terrific, and the academics there will probably put as little stock in this league table result as I do.
If the three measures I’ve looked at above have the ‘top’ institution near the bottom, then what could be pushing them to be the number one institution for ‘performance in teaching’? Let’s look at the other measures.
Firstly, ‘student to staff ratio’. The number one institution for ‘performance in teaching’ comes third in this… pat on the back. The Guardian have provided a handy caveat for this.
– Caveat: This measure only includes staff who are contracted to spend a significant portion of their time teaching. It excludes those classed as “research only” but includes researchers who also teach, even though at research-intensive universities research can take up a significant proportion of their time. It therefore follows that the simple ratio of the number of staff to students does not accurately reflect teaching intensity and also does not reveal who is performing the teaching. Is it the world-renowned professor or a graduate teaching assistant?
Ok, that’s interesting. And good that the caveat is there, so perhaps you’d give that category a low weighting?
Now the ‘spend per student’. The number one institution for ‘performance in teaching’ is top in money. Top by a relative mile. Good. Money is good. But seriously, investment in the undergraduates is important. It would be interesting to see how this is derived. Does it mean those students got brand new, expensive facilities? Is ground rent high? Did they all get shiny laptops?
On to the ‘average entry tariff’. The number one institution for ‘performance in teaching’ is top in this. By quite a margin. Good stuff. The students coming into the university have high entry qualifications – on entry. Before they do the course. I guess that’s a measure of the ‘performance in teaching’ of the course by assessing the entry qualifications of the cohort.
And finally, the ‘value added’ score. The number one institution for ‘performance in teaching’ is top in this. I think institutions taking students with lower entry grades did well on this because there was scope to add value. How did the poor number one institution for ‘performance in teaching’, with their super high ‘average entry tariff’, add value, I wondered. Then I read the methodology:
– Based upon a sophisticated indexing methodology that tracks students from enrolment to graduation, qualifications upon entry are compared with the award that a student receives at the end of their studies.
M’kay, well that’s going to be tricky for the number one institution for ‘performance in teaching’ to demonstrate. Wait, there’s more to the methodology:
– We always regard students who are awarded an integrated masters as having a positive outcome.
Thankfully, the number one institution for ‘performance in teaching’ only offers integrated Masters in materials.
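To make the incentive concrete, here’s a toy sketch with made-up numbers and a hypothetical scoring rule (the Guardian’s actual index is described only as ‘sophisticated’, so this shows the shape of the incentive, not their maths):

```python
# Toy illustration of the quoted "integrated Masters always counts as a
# positive outcome" clause. The tariff scale, expectation formula, and cohort
# are all invented for illustration.

def value_added(entry_tariff, good_award, integrated_masters):
    """Gap between a student's actual and expected outcome.

    High-tariff entrants are expected to do well, so a positive outcome earns
    little credit; the quoted clause makes every integrated-Masters student a
    positive outcome regardless of the award they actually receive.
    """
    expected = entry_tariff / 200          # crude expectation in [0, 1]
    positive = good_award or integrated_masters
    return (1.0 if positive else 0.0) - expected

# A uniformly high-tariff cohort, two of whom missed a "good" award...
cohort = [(190, True), (190, False), (190, False)]

without_clause = sum(value_added(t, g, False) for t, g in cohort) / len(cohort)
with_clause = sum(value_added(t, g, True) for t, g in cohort) / len(cohort)
print(without_clause, with_clause)
```

Under these invented numbers, the clause flips the department’s average value-added score from negative to positive without a single award changing.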
I’ve left out ‘career prospects’ because so few institutions returned information for this.
All of these measures, including the ones derived from actual student responses, have their caveats and downfalls. So the Guardian league table has a weighting associated with each one:
From this, it’s clear that responses from current students are given the lowest weightings, and the measures that come with caveats from the people who devised the methodology are given the highest weightings.
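The effect of those weightings is easy to demonstrate. Here’s a minimal sketch with invented weights and scores (not the Guardian’s actual figures) showing how a department that bottoms every student-reported measure can still top the composite ranking:

```python
# Hypothetical weights and standardised scores (0-100), invented to show how
# the weighting scheme, rather than the raw measures, decides the ranking.

WEIGHTS = {  # illustrative only; must sum to 1
    "satisfied_course":   0.05,
    "satisfied_teaching": 0.10,
    "satisfied_feedback": 0.10,
    "student_staff":      0.15,
    "spend":              0.15,
    "entry_tariff":       0.25,
    "value_added":        0.20,
}

def composite(scores):
    """Weighted sum of a department's measure scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Department A tops every student-reported measure; B tops the rest.
dept_a = {"satisfied_course": 95, "satisfied_teaching": 95,
          "satisfied_feedback": 95, "student_staff": 60, "spend": 50,
          "entry_tariff": 55, "value_added": 60}
dept_b = {"satisfied_course": 60, "satisfied_teaching": 55,
          "satisfied_feedback": 65, "student_staff": 90, "spend": 100,
          "entry_tariff": 100, "value_added": 95}

print(composite(dept_a), composite(dept_b))  # 66.0 vs 87.5: B wins comfortably
```

With the three satisfaction measures carrying only a quarter of the total weight, what current students actually report barely moves the final number.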
I’m not a fan of rankings because I’m terrified that our subject/field is misrepresented, or that the full opportunities aren’t presented to students. I’m sad that students who are still studying for their entry qualifications may put too much stock in league tables, when visiting the departments, seeing the facilities, and talking to the staff may give them more information and present the opportunities to them. This too has its problems: students who don’t have as much family support as others (which could include a necessity for family members or prospective students to work on visit days, typically weekends), or who have less disposable income, may struggle to travel around the country visiting universities.
Providing prospective students with information is important, but are league tables the way to do it? My thoughts are no.
I did promise a guide on how to come top in the Guardian subject league tables. So in light of the information I’ve found in the methodology, to become the number one institution for ‘performance in teaching’, departments should:
- Focus less effort and time on ensuring current students are satisfied with things like course content, teaching, and feedback.
- Make sure researchers who do even a little teaching are counted as teaching staff rather than classed as ‘research-only’ for the purposes of this league table, reducing their student-staff ratio, which gets multiplied by a large weighting.
- Be rich, spending lots of money per student registered (though this one only gets multiplied by a low weighting).
- Have really high entry requirements. This gets multiplied by a large weighting.
- Only offer integrated Masters, rather than the more flexible Bachelors with a funded Masters (the latter model can provide opportunities to students whose circumstances mean they can only continue studying at PG level if they have an income to support them and their families).
This article was mostly fuelled by snark, and does not represent the view of my employers or any other institutions/bodies I’m affiliated with.