Since 2003, UK higher education institutions have been required by the Higher Education Statistics Agency to collect the DLHE (Destinations of Leavers from Higher Education) survey. Replacing the old First Destination Survey, DLHE is a census which aims to record the destinations of students on one particular date of the year, approximately six months after they finish their course. The importance of DLHE has grown through the central place various newspaper league tables give the survey, and through the requirement that DLHE data be listed next to courses both on universities' own websites and on a central government website used to support young people and their parents in choosing where to study.
DLHE is therefore a key factor in the reshaping of HE careers provision. On the one hand, it has radically increased the focus on careers at university. Where some universities may previously have viewed the careers service as akin to the university counselling service — there for some students, some of the time, but not somewhere you want all of your students engaging all of the time — it is now seen as a key tool for driving up league table position, vying in importance with research grants and undergraduate degree results. On the other hand, it has been seen by some as re-forming careers work in a less than desirable manner, with its focus on short-term employment results over equipping students with career management skills for life.
This points to a key criticism of DLHE: is it measuring what counts, or merely counting what can be measured? DLHE is a large census; universities need to achieve an 80% return from their undergraduate home and EU students to have a valid survey. To this end, some of the measures are a touch blunt, focussing on employment status, whether a destination (work or further study) is graduate level (a category answered yes/no based on an analysis of job title and job duties as reported by the returnee), and salary. These are the key factors that are then reported to form the government's KIS data and to help inform league tables.
Criticisms of this are obvious. The yes/no nature of the graduate job category crushes any sense of nuance; six months seems an artificial period; where is the longitudinal data? Why are measures like salary and job title used to describe a destination as positive without asking for the student's own view?
All of this said, I feel there are factors within the survey that help paint a broader picture of student experience and contribute to a more nuanced view of how well students are supported on various courses. I want to pick out four that are worth looking into.
1) Post-graduate study versus employment. If you are asking how well prepared for work students are, then paying attention to how many are finding work as opposed to going on to further study is really important. How employable will the new degree make them? There may be a world of difference between an academic degree, a professional qualification in a competitive area (e.g. the LPC) and a professional qualification with a stronger link to an immediate job (e.g. the PGCE). That is to say, more nuance may be needed when thinking about further study as a destination.
2) Why are students taking their jobs? DLHE has a couple of more subjective measures that are often not reported. Students are given a range of options to choose between as to why they took their job, ranging from statements like "because it was the first job I was offered" and "needed money to support myself" through to "looking to build worthwhile experience" and "is part of my career plan." This sort of more subjective data, though not perfect, would still be an interesting starting point for understanding the student's own view of their destination.
3) What sector are they going into? Especially for prospective students, it may be interesting to see whether your view of where a degree might lead matches up with reality. Does Law lead to jobs in the world of business? Are humanities graduates being hired as graduate managers and consultants? Could an education studies degree open up a career in HR? Looking at the sectors students end up in can say a lot about the sort of support and direction they get. I feel that quality support, especially on more traditional academic subjects (think Maths, English Literature, Geography), will lead to a range of destinations: students won't be funnelled towards one sort of area, so looking at the breadth of sectors students end up in may be useful.
4) What is their view of how well university prepared them? This, for me, is a criminally under-used measure. Students rate their support in three main areas (employment, further study and enterprise) and are asked on a four-point scale how well prepared for these destinations they were. Students tend to be vaguely positive on this measure, but looking at the numbers at the extremes could still prove useful.
My frustration is that universities and league tables tend to stay away from these measures and in general use only a very narrow slice of the information created by the DLHE survey. Using the existing tool, there are more nuanced ways of reporting on student destinations, especially with greater use of the more subjective student data.
All this said, there are clear drawbacks to DLHE, and a lot of them are unavoidable. I feel it is a good thing to try to measure the quality of careers provision in HE, and looking at destinations has to be part of that picture. That said, DLHE is between a rock and a hard place on a number of issues. More longitudinal data would be useful, but the further you get from graduation, the harder it is to prove how much of a factor the graduate's course was in their career progress. How much credit can my undergraduate history department take for my career progress nearly ten years on? More subjective data would also be of use, but DLHE is a census: you are aiming to capture 80% of the potential survey population, and the more complex you make it, the harder it is to persuade graduates to complete it.
My feeling is that if you want to assess how well universities are preparing students for their futures, you have to qualitatively assess inputs as well. This means putting a greater level of focus on the thing universities can control — what they offer to students while at university — rather than focussing on what students do away from that support. Some form of quality award or inspection would also create a vehicle for policy makers to challenge universities to think about equipping students with skills for life rather than skills for the six months after leaving.
In short, DLHE isn't perfect but is a pretty good tool for the job; more of the range of data it collects could be used, especially in constructing league tables, and there is a need to assess the quality of inputs as well as outputs.