The standard procedure every Year 12 and Year 13 student goes through is this: Google “university rankings UK for [insert subject choice]”, black out the universities whose entry requirements are way out of your league, black out anything in Wales, and select the top five universities that are left.
Even the most thoughtful and cautious of us are guilty of having done this at least once. Three popular university rankings are published annually in the UK: The Times Good University Guide, The Guardian University Guide, and The Complete University Guide. However, these tables are fallible. Let us go through them one by one.
The Times (www.thetimes.co.uk/tto/public/gug/)
I must start with a warning that you must pay to view whole articles on The Times online, including the Good University Guide section and its league tables. That could deter a few of you already; however, missing out on The Times’ ranking angle might not be such a bad thing.
It considers the following factors, in order of weighting: A-Level/Higher points, teaching excellence, research quality, peer assessments, unemployment, and lastly, student satisfaction. A-Level/Higher points attempt to measure student quality and the institution’s entry standards; the measure falls short because some universities have very international demographics, with students holding very different qualifications. St Andrews, Scotland’s oldest university, has 1,500 international undergraduates comprising 30 percent of its total student body; they lose out in this measurement. In addition, a school’s ability to select smart undergraduates says nothing about its ability to teach.
Teaching excellence appears to be a stable element, about as reliable as any attempt to quantify teaching can be, largely because it tries to incorporate all the other measures at once.
Research quality and peer assessment should be paid the least attention when comparing universities. Peer assessment is as subjective as it gets, and as for research quality: you attend a university to learn and be taught, and the people who do the teaching tend to be completely different from the lab rats on the institution’s payroll who focus purely on producing world-class research papers to lift rankings.
Unemployment is a really important one. If you shell out up to GBP 51,000 for a three-year degree, you’ll want a job within six months of graduating. Unfortunately, it is weighted half as heavily as research quality in The Times rankings, which is a little illogical. It is also not that insightful. Unemployment is defined as being in neither active employment nor postgraduate study, and anyone can continue to postgraduate study if they so wish, so the figure reveals nothing of the undergraduate course’s actual value. In fact, many students take a Master’s degree purely because graduate prospects are so bad. To gain a shrewd impression of employment prospects, look for pie charts or figures on university websites detailing the percentage of students employed within six months of graduating, versus those continuing further education and those unemployed and seeking work.
Student satisfaction is handy to consult; you’re signing over three years of your life to one college, so feeling satisfied with it is vital. However, surveys are fallible and heavily subjective, and there are only so many graduates The Times can consult, which means they may not form a representative sample.
The Guardian

The Guardian has a more comprehensive university ranking system. It uses the following criteria, in order of weighting: entry score, job prospects, staff/student ratio, value added, teaching quality, and feedback. Entry score measures the average UCAS tariff point score of first-year undergraduates. However, The Guardian admits that this average can be formed from as few as eight sample students, so the chances of accurate representation are low.
Job prospects has the same benefits and flaws as The Times’ unemployment measure. Staff/student ratio is new, and an underrated but important criterion. Universities like Anglia Ruskin, with 22.7 students to every member of staff, suffer from impersonal learning: teachers tend not to know students’ names and cannot pay individual attention to each person’s learning, so both academic and personal development become very independent. Universities with lower student-to-staff ratios are better able to cater to individuals’ needs. However, this is very much a question of personal preference. Those partial to the US system, where students learn primarily through large lectures and study independently without much tutelage, would do well to ignore this facet of the ranking.
The “value added” factor is unique to The Guardian. It is an interesting and perceptive indicator of the quality of education you can expect to receive, comparing undergraduates’ entry qualifications to their actual degree results. This tells you how much the university helped its students grow intellectually over the course. It is far more insightful about the actual quality of the schooling than entry standards: colleges that can take a class of predicted Thirds and turn them into Firsts over three years teach better than colleges that select the best and provide average teaching. After all, university isn’t about getting into the hardest school; it’s about edifying yourself. However, this measurement is also flawed. Accepted “top” universities like Oxbridge have much lower percentages of graduating Firsts than other high-ranking universities, because their courses are that much tougher and their grade boundaries that much higher, but that doesn’t mean their students are less academically proficient.
Teaching quality is similar to The Times’ teaching excellence, except it draws on a single source: the National Student Survey. That survey does touch on many different aspects of the teaching experience, however, so I would call it reliable.
Feedback measures the quality of feedback and information that students are given throughout the course by the faculty. This is collected by the National Student Survey in a similar fashion to teaching quality. It’s a very small aspect of The Guardian’s ranking, but is quite helpful for those looking for a well-guided education.
The Complete University Guide (www.thecompleteuniversityguide.co.uk/league-tables/rankings)
Probably the most popular of all national university rankings, The Complete University Guide works exclusively on producing annual league tables and analyses. It considers the most criteria of the three guides, adjusted using the complex-sounding “Z-transformation”: student satisfaction, research quality, completion, student-to-staff ratio, graduate prospects, entry standards, good honours, academic services spend, and facilities spend. Student satisfaction and research quality are weighted more heavily than the other elements.
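For the curious, the “Z-transformation” is less exotic than it sounds: it is standard statistical normalisation, where each raw score is rescaled by subtracting the mean and dividing by the standard deviation, so that criteria measured on wildly different scales (UCAS points versus survey scores, say) can be combined into one total. Here is a minimal sketch of the idea; the universities, scores, and weights below are made-up illustrations, not the guide’s actual data or weighting.

```python
# Sketch of how a z-transformation lets criteria on different scales be
# combined. All numbers and weights below are invented for illustration;
# they are NOT The Complete University Guide's real data or methodology.
from statistics import mean, pstdev

def z_scores(values):
    """Rescale raw scores so they have mean 0 and standard deviation 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Three hypothetical universities, two criteria on very different scales.
entry_standards = [450, 380, 320]   # average UCAS tariff points
satisfaction    = [4.1, 4.3, 3.9]   # survey score out of 5

z_entry = z_scores(entry_standards)
z_satis = z_scores(satisfaction)

# A weighted sum of the normalised scores gives each university a total;
# sorting those totals produces the league table.
totals = [0.6 * e + 0.4 * s for e, s in zip(z_entry, z_satis)]
ranking = sorted(range(len(totals)), key=lambda i: -totals[i])
```

Without the normalisation step, the UCAS figures (in the hundreds) would completely swamp the satisfaction scores (out of 5), which is exactly the problem the transformation exists to solve.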
Student satisfaction is actually taken straight from the National Student Survey’s reviews of teaching quality, as used by The Guardian, and is reliable. Research quality, however, is just as unreliable as in The Times Good University Guide. A great and unique measurement of schooling quality is completion: it tells you the percentage of students who failed to complete their degree, and hence your chance of becoming a dropout if you attend that school. Student-to-staff ratio, graduate prospects, and entry standards are the same as The Guardian’s equivalents. Good honours measures the proportion of Firsts and upper seconds in the graduating class, a less insightful variant of The Guardian’s value added.
The last two aspects are best ignored. Academic services spend and facilities spend are extremely trivial criteria, easily exploited by universities to crawl up the ranks by building new swimming pools, refurbishing dormitories, and buying new chairs for classrooms. This is a rampant problem with the US college ranking system, and really ought to be taken out of all ranking tables.
Overall, if I were forced to recommend one ranking to consult, I would say The Guardian University Guide. Its value added criterion is particularly discerning.
However, all three are in a way better than international ranking systems such as the Academic Ranking of World Universities, the QS World University Rankings, and the Times Higher Education World University Rankings. Those are even more heavily reliant on college spending and research quality, with the addition of other superficial criteria: citations per faculty, number of alumni prize winners, proportion of international intake, and so on. The nationally exalted London School of Economics is in the top five across all three national guides, but falls to 328th on the global stage. As a specialist in the social sciences, it cannot compete with multi-faculty STEM universities on alumni Nobel Prize winners or citations per faculty, and its small campus inevitably means less money to spend, fewer faculty to cite, and fewer alumni to win prizes.
So other than the university guides mentioned so far, what else should you consult? Well, if you have already found a passion and know which subject you wish to major in, then it’s a good idea to look up the members of faculty at universities that interest you. Faculty websites ought to have detailed bios of all teaching staff. Choose a university where you can identify qualified, competent teachers whose specialties or passions align with your own.
Last but not least, it’s imperative to consider campus life and location: your campus will be the base for your entire life for three years, not just your education. And though it goes against what you probably hear all the time, life is not all about education.
Photos: Courtesy of The Guardian University Guide.
Amy is a Year 13 student at the British School of Beijing (BSB). Having grown up in The Hague in the Netherlands, she has lived in Beijing for four years and hopes to share her views on current affairs through her blogs.