Using CAT4 and PASS data to identify potentially vulnerable learners

Stamford Senior School (part of Stamford Endowed Schools) is an independent school for girls and boys aged 11 to 18 based in Stamford, Lincolnshire and a user of GL Assessment’s Complete Digital Solution (CDS). Dr Andrew Crookell is Assistant Headteacher at the school with responsibility for Teaching and Learning. Andrew explains below how the school has made best use of data from the Cognitive Abilities Test (CAT4) and Pupil Attitudes to Self and School (PASS) to identify potentially vulnerable learners.

One of the biggest challenges schools face with assessment is having more data on their students than they can handle – even more so if they don’t know exactly what they can do with this data. Our starting point was to decide what it was we really wanted to find out about our students from the data. We therefore asked ourselves questions such as:

  • Which students are furthest from achieving in line with their potential?
  • Which students are exceeding expectations by the most?
  • Which students believe they are doing badly, but aren’t really (academic anxiety)?
  • Which students are underperforming but think they are doing fine?
  • How are specific groups of students doing relative to their peers (e.g. SEN, EAL)?
  • Are we getting the balance right between additional support for weaker students and stretching the more able students?
  • Can our staff tell the difference between ability and effort?
  • Are the students in this year group being stretched and challenged?

Key outcomes:

  1. Use the Standardised Age Scores and indicated grades provided by CAT4 to compare with student attainment grades and identify those achieving above/below average

  2. Administer PASS each term to obtain a measure of students’ self-perceived preparedness for learning and responses to curriculum demands

  3. Identify the students who require further investigation and use the data from CAT4 and PASS to discuss and review strategies for these students

To answer these questions, we used the students’ current attainment grades from teaching staff and ‘Approach to Learning’ (essentially effort) grades used to measure engagement with their learning. We compared this with the Standardised Age Scores and indicated grades provided by CAT4, as well as the student self-evaluation data provided by PASS.

Using this data in isolation, without a deeper understanding of each student’s context, could be misleading. We therefore set up review meetings to discuss exactly what was happening with these students, what we were already doing to support them, and what else we might be able to do next.

Current attainment vs potential ability

Fig 1 is an example profile of a year group, comparing the Standardised Age Scores from CAT4 with their average attainment grades. At a simplistic level, this told us that the most potentially able students were the ones attaining the higher grades (which illustrated to us that our teaching programme was working to some extent).

We could identify the students (above the green line) who were achieving attainment grades significantly above average for their particular abilities and should be rewarded for this. Perhaps most importantly, we could also identify the students (falling below the red line) who, for their abilities, were significantly below the mean for this cohort. These were the students that we needed to have more detailed conversations about.

If we were not looking at the data in this way, some of the students who were quite able but underperforming might have been missed. There may not have been anything fundamentally wrong with their grades, and they may not have been the lowest in the year group; however, we knew that these students were achieving considerably below where they ought to be.

Fig 1. Please note the student names on these visualisations have been anonymised.
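The comparison behind Fig 1 can be sketched in a few lines of Python: fit a trend line of attainment against CAT4 Standardised Age Score, then flag students whose residual falls more than one standard deviation above or below it. The data, grade scale, and threshold here are illustrative assumptions, not the school’s actual figures or GL Assessment’s methodology.

```python
# Sketch: flag students whose attainment deviates markedly from the trend
# for their CAT4 Standardised Age Score (SAS). All values are illustrative.
from statistics import mean, stdev

# (name, CAT4 SAS, average attainment grade on an assumed numeric scale)
students = [
    ("Student A", 118, 7.5),
    ("Student B", 102, 6.8),
    ("Student C", 124, 5.9),
    ("Student D", 95, 5.2),
    ("Student E", 110, 6.4),
]

# Least-squares line of best fit: expected grade as a function of SAS.
xs = [s[1] for s in students]
ys = [s[2] for s in students]
x_bar, y_bar = mean(xs), mean(ys)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / sum(
    (x - x_bar) ** 2 for x in xs
)
intercept = y_bar - slope * x_bar

# Residual = actual grade minus the grade expected for that ability level.
residuals = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
threshold = stdev(residuals)  # one standard deviation above/below the line

for (name, sas, grade), r in zip(students, residuals):
    if r > threshold:
        print(f"{name}: above the 'green line' (exceeding for ability)")
    elif r < -threshold:
        print(f"{name}: below the 'red line' (flag for review)")
```

In practice the school plots this as a scatter chart; the same residual logic is what places a student above the green line or below the red one.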

Understanding students’ engagement with learning

Administering PASS surveys with students across the school each term allows us to obtain a measure of each student’s self-perceived preparedness for learning and responses to curriculum demands. We also aggregate scores to consider students’ self-perceived academic competence and contextual factors concerning aspects of their learning.
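As a rough illustration of this kind of aggregation, the sketch below averages PASS factor percentiles into two composite scores. The factor names follow the standard PASS instrument, but the grouping into “self-perception” and “contextual” composites, and the values, are assumptions for illustration only, not GL Assessment’s published methodology.

```python
# Sketch: aggregating one student's PASS factor percentiles into two
# composite measures. Grouping and scores are illustrative assumptions.
from statistics import mean

pass_scores = {
    "Perceived learning capability": 62.0,
    "Self-regard as a learner": 48.0,
    "Preparedness for learning": 55.0,
    "Response to curriculum demands": 41.0,
    "Attitudes to teachers": 70.0,
    "General work ethic": 58.0,
}

# Hypothetical grouping: how the student sees themselves as a learner,
# versus contextual factors around their learning.
self_perception = ["Perceived learning capability", "Self-regard as a learner"]
contextual = ["Preparedness for learning", "Response to curriculum demands",
              "Attitudes to teachers", "General work ethic"]

self_perception_score = mean(pass_scores[f] for f in self_perception)
contextual_score = mean(pass_scores[f] for f in contextual)

print(f"Self-perceived academic competence: {self_perception_score:.1f}")
print(f"Contextual factors: {contextual_score:.1f}")
```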

To visualise the students’ PASS data, we plotted their results on a quadrant (see Fig 2) against the difference between their current attainment grades and the indicated grades provided by CAT4. Students at zero on this axis were meeting expectations; those above zero were exceeding them. With the PASS data we could then compare this with their self-perception, i.e. their engagement with their learning.

Ideally, we really wanted our students to be occupying the top-right corner of the quadrant – where they were perceived as engaging with their learning and exceeding expectations. In the top left, we had the students who were exceeding expectations but didn’t rate themselves as engaged with their learning – therefore we wanted to have conversations with those students and their teachers to find out why that might be the case (e.g. academic insecurity or pressure to be high-achieving).

In the bottom-right we had the students who were underperforming but rated themselves as engaged with their learning. These students would need a conversation to discuss what changes could be made to get where they needed to be. Then in the bottom-left we had the students that we really needed to be concerned about – they were struggling and knew that they were. We could therefore start putting interventions in place for these students such as academic mentoring.
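The four-quadrant logic described above could be sketched as follows. The engagement threshold, axis scales, and example students are illustrative assumptions rather than the school’s actual cut-offs.

```python
# Sketch of the Fig 2 quadrant logic: one axis is the student's PASS
# engagement score, the other is attainment minus the CAT4 indicated grade
# (so zero means meeting expectations). Threshold and data are assumptions.

ENGAGED_THRESHOLD = 50.0  # assumed percentile treated as "engaged"

def quadrant(grade_vs_indicated: float, engagement: float) -> str:
    """Place a student in one of the four Fig 2 quadrants."""
    exceeding = grade_vs_indicated >= 0
    engaged = engagement >= ENGAGED_THRESHOLD
    if exceeding and engaged:
        return "top-right: engaged and exceeding expectations"
    if exceeding and not engaged:
        return "top-left: exceeding but low self-rating (possible anxiety)"
    if not exceeding and engaged:
        return "bottom-right: engaged but underperforming (review strategies)"
    return "bottom-left: struggling and knows it (intervene, e.g. mentoring)"

# Example usage with illustrative students:
for name, diff, eng in [("A", +0.5, 72), ("B", +0.8, 35),
                        ("C", -0.6, 80), ("D", -1.2, 20)]:
    print(name, "->", quadrant(diff, eng))
```

Each quadrant then maps to a different conversation: celebration, reassurance, a change of strategy, or an intervention.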

Fig 2. Please note the student names on these visualisations have been anonymised.

Measuring the impact

Last year, we had conversations about 240 students in our review meetings. For around 30% of these students we were able to identify strategies that hadn’t already been put in place so that we could try something new for them. By the end of the year, 42% of those 240 students were no longer a cause for concern. This provided us with evidence of the impact of this data visualisation, which wouldn’t have been achievable without the additional insights provided by CAT4 and PASS.
