
Postgraduate quality will not be taken seriously until there is a national PG taught survey



It’s time for the Office for Students (OfS) and the sector to take postgraduate taught education seriously and give it the policy attention it deserves.

I’ve been working in higher education for over 30 years and one of the enduring constants over that time has been the almost complete lack of policy debate about postgraduate taught students and their experience – somehow they always seem to get lost.

There is occasionally some discussion about diversity – although more recently this has sadly been about the reliance on Chinese students to sustain many universities rather than postgraduate study as a driver of social mobility.

A good example of this is the publication of B3 data by OfS – the first time that a comprehensive set of performance indicators on taught postgraduate students has been available, despite the equivalent undergraduate measures having been around since the late 1990s.

The same is not true for postgraduate research students who have always had their needs championed by UKRI, and before that the individual research councils, who rightly see them as the next generation of research talent.

Back in the day HEFCE even dabbled with projected completion rates for research students – these were fraught with difficulty, yet they still took precedence over similar figures for taught students that would have been much easier to calculate and more robust.

Two out of three

The publication of B3 data by OfS (at least in England), and the long-standing publication of the Graduate Outcomes survey by Jisc, means that there is just one glaring omission in the postgraduate taught student landscape – any attempt to ask students about their experience.

There is currently no mandated national survey for postgraduate students despite the fact that 38 per cent of all entrants each year are to postgraduate programmes. How can it be right that such a large group of students have no voice and we know so little about their experience?

The first serious discussion of a postgraduate equivalent to the National Student Survey (NSS) appeared in the 2010 white paper Students at the Heart of the System – and in 2013 the UK funding bodies started work on a possible national survey, getting as far as quite detailed pre-consultation during 2017.

OfS continued this work, and as recently as 2022 commissioned a second pilot survey, but since then nothing has happened.

Thanks to the good work of Wonkhe we have some headline figures from those pilot exercises. They show that, in aggregate, students are positive about their experiences, with 86 per cent responding positively to the question “Overall, how would you rate the quality of your course?” – a question that is sadly unlikely to ever make it into a survey run by OfS, for what can only be described as spurious reasons.

PTES out

Of course, we do already have a national survey – the Postgraduate Taught Experience Survey (PTES), run for many years by Advance HE. It again highlights an overall positive experience for postgraduates at those universities that choose to participate: 83 per cent in the latest iteration.

There is very little wrong with this survey itself other than the timing – which means many students will have only started dipping their toe into the dissertation by the time they complete the survey.

Advance HE publishes a quite detailed analysis of the survey every year and provides benchmarking data to participating universities. The real problem is that the detailed figures for each university, and for each subject within a university, are not published, and participation is optional – although over a hundred universities took part in 2023.

The lack of detailed publication is not Advance HE’s fault: these features are necessary when you have no power to compel participation, and very few universities would risk taking part in a voluntary survey if the results were public. The lack of telephone responses also hinders response rates, which could surely be improved with a multi-mode survey.

Reputational damage

Way back in 2005, when the UK funding bodies ran the first NSS, many questioned the value of asking for students’ views on their courses and some actively disagreed with the findings – it wasn’t that feedback was poor, it was final year students failing to recognise good feedback!

There were also significant concerns about the potential damage that the survey would do to the UK’s international reputation. As the twentieth iteration of the survey ends and we eagerly await the results, it is now accepted as a valuable and enduring survey that has focused minds on the undergraduate student experience.

There are those who criticise the NSS, arguing that the concepts it captures do not necessarily reflect the current academic literature on what makes for a high-quality student experience. However, I would argue that, with a few exceptions driven by political motives, the questions focus on topics that are, and should be, important to students.

I have yet to see a compelling argument as to why it is not important that staff are good at explaining things or that students are able to contact staff when they need to.

During early discussions about a postgraduate survey many in the sector feared the untold damage that could be done to international recruitment by the UK becoming one of the few countries with such a survey. Based on the OfS pilots and the Advance HE survey, there is good evidence that the scores in any national survey would be strong enough to support recruitment rather than undermine it.

There is also no evidence that the existence of NSS has harmed our recruitment of international undergraduates at the national level – and I see no reason why this pattern should not be followed for postgraduate study.

Of course, there may be some subjects at some universities where the results are poor and this damages their recruitment – but it is hard to argue that a mechanism that penalises courses with a poor student experience is a bad thing.

Arguably those universities upping their game will help UK higher education in the global marketplace.

The rational choice

Throughout its history, NSS has been framed as a vehicle to drive choice, as part of a deeply flawed assumption that students make rational choices of where to study based on a detailed analysis of vast swathes of data.

All of the research that I have seen shows this simply isn’t the case – with decisions more likely driven by the heart than the head, allowing data to be used as a hygiene factor to rule out poor quality courses or confirm high quality ones.

I would argue that far from being irrational this is the height of rationality. A prospective student planning to spend three or more years of their life somewhere, and to incur significant liabilities on their future earnings, had better be sure it’s somewhere they will feel at home. At the end of the day the differences between many universities on the metrics are, in practical terms, quite small – it is the outliers that students need to be aware of.

Similar research on the way postgraduates make choices indicates that they are heavily influenced by factors such as research strength and detailed course content, with many choosing programmes based on single modules led by world-leading academics. A national postgraduate taught survey would therefore be unlikely to radically change the way students choose their courses.

But that doesn’t mean it is a waste of time and money.

In such a reputation-sensitive sector, the impact of poor results on the standing of universities or faculties will be such that senior leaders take notice – and seek to improve them. Of course, many universities already do this by participating in the Advance HE survey, but arguably they are not the universities that need prodding to take the postgraduate student experience seriously.

Fourteen years on from the white paper, it is time that OfS and its counterparts in the devolved administrations stepped up to the plate and delivered a universal survey of postgraduate taught students, with detailed and benchmarked data published. Better still, OfS could actively include results of the student surveys in their regulation of providers and truly become the office for students.


