Using Data to Improve Student Retention: More Questions than Answers?

The “Issues around Using Data to Improve Student Retention” session at the JISC CETIS Conference 2012 examined some of the issues around using data to predict which students are at risk of failure, and provided some examples of possible solutions to ensure that institutions do not fail such students.

The session was divided into five mini-sessions run by some of the JISC Relationship Management Programme projects. Here is an overview of some of the solutions being trialled by the projects and the challenges they face.

Loughborough University’s Pedestal for Progression project is using Co-Tutor to collect attendance data. Student attendance at lectures and personal tutor meetings is recorded, so it’s possible to see at a glance whether a student is turning up or if they dislike Monday mornings. The system gives automatic flags and provides a personal tutor with a red, amber or green visual of attendance (traffic lighting). As a result of introducing registers, attendance has gone up from 65.54% in 2004/5 to 70.26% in 2010/11.
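At heart, the traffic lighting described here is a thresholding exercise on attendance rates. The sketch below shows that idea in its simplest form; the 80%/60% thresholds and function names are illustrative assumptions for this post, not Co-Tutor’s actual rules.

```python
# Illustrative sketch of RAG (red/amber/green) flagging of attendance.
# Thresholds are assumptions for demonstration, not Co-Tutor's actual rules.

def attendance_rate(sessions_attended: int, sessions_expected: int) -> float:
    """Return attendance as a fraction of expected sessions."""
    if sessions_expected == 0:
        return 1.0  # nothing scheduled yet, so nothing to flag
    return sessions_attended / sessions_expected

def rag_flag(rate: float, amber_threshold: float = 0.8, red_threshold: float = 0.6) -> str:
    """Map an attendance rate onto a red/amber/green flag."""
    if rate < red_threshold:
        return "red"
    if rate < amber_threshold:
        return "amber"
    return "green"

if __name__ == "__main__":
    # Example: a student who attended 8 of 14 expected lectures and tutor meetings.
    rate = attendance_rate(sessions_attended=8, sessions_expected=14)
    print(f"attendance {rate:.0%} -> {rag_flag(rate)}")  # attendance 57% -> red
```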

Roehampton University’s fulCRM project is being run in parallel with attendance data recording within the Psychology Department. Fingerprint readers have been installed in all lecture theatres to record student attendance. Although staff have debated the “big brother” aspects of this approach, students appear less worried: many are already used to doing this at school and, in any case, fingerprint attendance seems to make students feel that it (and they) really count and that someone is taking an interest in them. As the Department is fairly small, data were initially kept in an Excel spreadsheet and a traffic light system used. The fulCRM project is now in the process of pulling automated data feeds from the attendance monitoring system into a student performance module.
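In spirit, the automated feed replaces manual spreadsheet upkeep with a scheduled import from the attendance system. The sketch below assumes a hypothetical CSV export with columns student_id, session_id and attended, and simply aggregates it per student; the real fulCRM integration will of course differ in detail.

```python
# Sketch of aggregating an attendance-monitoring export into per-student figures.
# The CSV layout (student_id, session_id, attended) is a hypothetical example,
# not the actual fulCRM or fingerprint-reader feed format.
import csv
from collections import defaultdict

def summarise_attendance(feed_path: str) -> dict[str, float]:
    """Return the attendance rate per student_id from a simple attendance export."""
    attended = defaultdict(int)
    expected = defaultdict(int)
    with open(feed_path, newline="") as f:
        for row in csv.DictReader(f):
            expected[row["student_id"]] += 1
            if row["attended"].strip().lower() in {"1", "true", "yes"}:
                attended[row["student_id"]] += 1
    return {sid: attended[sid] / expected[sid] for sid in expected}

if __name__ == "__main__":
    for student_id, rate in summarise_attendance("attendance_export.csv").items():
        print(student_id, f"{rate:.0%}")
```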

Using the traffic light system, 18 students were identified as “red” just after the first semester (mostly male or part-time students, students with personal problems, or those who had to travel long distances), so extra support was put in place for them. In 2010, 8.3% of at-risk students terminated their studies and 19.8% didn’t have enough credits to go on to Year 2. In 2011, after the traffic lighting system was put in place, this fell to 5.2% of at-risk students terminating and 9.34% without enough credits to go on to Year 2. Of the 19 students with red flags, 16 have now gone on to Year 2 following retakes and extra support (only 3 actually left). In the past, many of these students would simply have drifted off course.

During the course of their project, the Student Dashboard project team at the University of Southampton has met with resistance from data owners, service providers, ethics authorities and faculty administrators, who cite a number of reasons why various types of data (from photos to student grades) can’t be used. They currently have a simple dashboard that shows pictures of tutees, directory information, whether coursework has been handed in, and attendance. This particularly helps staff who need to let the UK Border Agency know that international students have been seen and have attended classes. However, most of this information isn’t available in the rest of the University because of data protection, which is seen as protecting staff from doing any work or taking any risks.
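The dashboard itself boils down to a small per-tutee record drawn from a handful of sources. Below is a hedged sketch of what such a record might look like; the field names and the “needs attention” heuristic are assumptions made for illustration, not the Southampton data model.

```python
# Illustrative shape of a per-tutee dashboard record; field names are assumptions,
# not the actual Southampton Student Dashboard schema.
from dataclasses import dataclass, field

@dataclass
class TuteeDashboardEntry:
    student_id: str
    name: str
    photo_url: str              # directory photo, where permission allows
    email: str                  # directory information
    coursework_submitted: dict[str, bool] = field(default_factory=dict)  # assignment -> handed in?
    attendance_rate: float = 1.0  # fraction of expected sessions attended

    def needs_attention(self) -> bool:
        """Simple illustrative heuristic: missing coursework or low attendance."""
        missing = any(not done for done in self.coursework_submitted.values())
        return missing or self.attendance_rate < 0.6

if __name__ == "__main__":
    entry = TuteeDashboardEntry(
        student_id="s1234567",
        name="Example Student",
        photo_url="https://example.org/photos/s1234567.jpg",
        email="student@example.org",
        coursework_submitted={"PSY1001 essay": True, "PSY1002 lab report": False},
        attendance_rate=0.72,
    )
    print(entry.needs_attention())  # True: one piece of coursework is missing
```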

The University of Derby’s SETL (Student Engagement Traffic Lighting) project is also using a traffic lighting approach and has been looking at some of the softer areas of engagement analytics. The team has produced a dartboard diagram showing the primary (e.g. attendance monitoring), secondary (e.g. sickness) and tertiary (e.g. student complaints) indicators of risk. They have also produced a “Withdrawal Calendar” to ascertain whether there are any key dates for withdrawal – there are. At Derby, a Progress Board meets twice a year to decide whether students are meeting the academic requirements and can progress on their course. These are key times when students are likely to withdraw.
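One way to read the dartboard diagram is as a weighting scheme: primary indicators count for more than secondary ones, which count for more than tertiary ones. The sketch below shows that idea in the crudest possible form; the weights and thresholds are invented for illustration and are not SETL’s actual model.

```python
# Crude sketch of combining primary/secondary/tertiary engagement indicators
# into a single traffic-light flag. Weights and thresholds are illustrative
# assumptions, not the SETL project's model.

INDICATOR_WEIGHTS = {"primary": 3, "secondary": 2, "tertiary": 1}

def risk_score(triggered: dict[str, int]) -> int:
    """triggered maps an indicator tier to the number of indicators raised in it."""
    return sum(INDICATOR_WEIGHTS[tier] * count for tier, count in triggered.items())

def risk_flag(score: int) -> str:
    if score >= 6:
        return "red"
    if score >= 3:
        return "amber"
    return "green"

if __name__ == "__main__":
    # Example: poor attendance (primary) plus a period of sickness (secondary).
    score = risk_score({"primary": 1, "secondary": 1, "tertiary": 0})
    print(score, risk_flag(score))  # 5 amber
```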

The ESCAPES (Enhancing Student Centred Administration for Placement ExperienceS) project at the University of Nottingham is exploring how an ePortfolio can enhance student engagement whilst students are on placement. Previous project work has already shown that both students on placement and staff found that ePortfolio tools, such as Mahara, have helped them to keep in touch with each other. Students have also valued the feedback aspect as it helps them to feel more motivated.

Some of the challenges, questions, and issues that have arisen during these projects can be grouped into the following areas:

Technical

  • Using relationship management systems, such as Co-Tutor, may create a bigger workload for staff, which may then manifest as a lack of engagement with the system.
  • There may not be any integration with other systems, such as e-mail, so staff have to copy and paste any e-mail from students into the system, or it may be difficult to exchange data between systems.
  • Not all data may be captured electronically but may still be paper-based.
  • It can be expensive to roll out pilot data monitoring solutions across the whole institution.

Human Aspects

  • Where interactions are recorded, such as personal tutor meetings, staff may feel that they have less freedom about where that interaction takes place, i.e. it may now have to take place in the tutor’s office rather than a neutral space such as a café.
  • Institutions need to make it clear to students that such systems are there to help them.
  • Should students be forced to attend classes?
  • Departments that hold such data can be difficult to work with. Data owners may be distributed across the institution, and people hold their data close to their chests and don’t want to share. Perhaps the most difficult part is managing the soft human interface.

Data Privacy/Ethics

  • Who should be allowed to see staff data?
  • Who monitors the monitors?
  • Can use of data in this way be seen as “Big Brotherish”?
  • How quickly should staff intervene if a student is “red-flagged”?
  • Is it ethical to pre-load a system with at-risk demographics, e.g. part-time, male students?
  • What directory information should be available to everyone on a university intranet? Should photos be included (it can be useful for staff to put a name to a face)? What information shouldn’t be included?
  • Should grade history be confidential?
  • Students already share their data informally on Facebook, so why should the institution get involved?
  • Is it OK to use student data for research?
  • Who are the stakeholders involved in holding the data? What about contracts with work-based learning students?
  • Data needs to be handled sensitively, both in how it is collected and how it is held.
  • Who should access what data? Should whoever needs the data be able to access it? What counts as legitimate? Data access needs to be considered before any opt-in box is ticked, and explicit permission is needed from the student. Should anyone have access to depersonalised data, as long as it’s properly depersonalised (see the sketch after this list)?
  • What about control of access for the student? A student might be overwhelmed with information if everyone has access to their data.
  • What about access by parents or other interested parties?
  • What is the “right data” to collect and analyse? For example, a student might have patchy attendance but interact very well with their tutor and do well in assessments. Whoever acts on the data needs to know the student well.
  • What are the softer engagement analytics? What aspects of the student lifecycle can be harnessed? How do we capture this soft data? It needs to be about the social life as well as the academic life.
  • How early do we need the data and what would those data sources be? The data might come too late, i.e. the predisposing factors may already be in place before the student starts.
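On the “properly depersonalised” question raised above, even the simplest approach – replacing identifiers with keyed pseudonyms – illustrates the trade-off: analysis stays possible, but whoever holds the key can re-identify. The sketch below uses an HMAC-based pseudonym purely as an illustration; it is not a claim that this alone makes data safe to share.

```python
# Illustrative pseudonymisation of student identifiers using a keyed hash.
# This is a sketch of one possible approach, not a complete anonymisation scheme:
# rare combinations of other fields can still identify a student.
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-securely-stored-key"  # assumption: held only by the data owner

def pseudonymise(student_id: str) -> str:
    """Return a stable pseudonym; only the key holder can link it back to the student."""
    return hmac.new(SECRET_KEY, student_id.encode(), hashlib.sha256).hexdigest()[:16]

if __name__ == "__main__":
    record = {"student_id": "s1234567", "attendance_rate": 0.58, "flag": "red"}
    shared = {**record, "student_id": pseudonymise(record["student_id"])}
    print(shared)
```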

The session brought a lot of issues to light, along with some real privacy and ethical concerns, but it also highlighted that even a small change (such as keeping a register) can make a difference to student retention. However, we need to remember that students are not just collections of bits and bytes to be analysed and examined for trends. They are human, with all the foibles, idiosyncrasies and circumstances that make each of us unique.