Supporting students as they start Year 7
St Bede's is a Catholic secondary academy in Lincolnshire with 700 students aged 11–18. Formative assessment has always been central to its approach to teaching and learning; however, with the removal of levels, the school began to review that approach – and, in particular, how it assessed students when they joined the school.
“In the past, we relied on a combination of Key Stage 2 SATs data and teacher assessment, but we found big inconsistencies between the two,” explains Ryan Hibbard, St Bede’s Assistant Headteacher for Assessment and Curriculum. “That’s when we decided to introduce the Cognitive Abilities Test (CAT) to provide an independent perspective.”
Having used CAT for a number of years, St Bede’s decided to adopt an even more comprehensive approach following the removal of levels, using a range of complementary standardised tests.
“We knew about CAT already but as soon as we heard what else was available, we knew it was going to be ideal for what we needed,” said Ryan.
“Where in previous years we had used Key Stage 3 SATs papers for English and Maths at the start of Year 7, they are all out of date now with the new curriculum. We also didn’t have any confidence in the levels they were giving us. We needed robust assessments that we could trust, and we wanted tests that could give a national benchmark for each student.”
St Bede’s decided to adopt the Transition Assessment Package from GL Assessment – a suite of assessments specifically designed to assess ability, attainment and barriers to learning.
The assessments include CAT4, the Progress Test Series in English, Maths and Science (level 11T), the New Group Reading Test (NGRT) and the Pupil Attitudes to Self and School (PASS) attitudinal survey. Level 11T of the Progress Test Series has been developed and standardised for use at the start of Year 7, giving schools a robust starting point for target-setting.
All of the assessments within the Transition Assessment Package are delivered online – something that St Bede’s found beneficial.
“One of the main advantages of digital assessment is that it’s instant. The reports are available straight away – they’re there in the time it takes for me to walk from the computer room to my desk. That’s been really powerful – we have been able to act on information immediately.”
The school wanted to assess its new intake as soon as possible and decided to roll out all of the assessments by the October half term.
“We booked out the computer room and started with CAT4, which took two hours. We then scheduled in the New Group Reading Test and used a PSE lesson for the PASS survey. Both of these took an hour each. In weeks three and four, we then scheduled the English, Maths and Science tests in the subject lessons. This meant that we had all the information we wanted really early on,” said Ryan.
What the data showed
At the beginning of the term, St Bede’s set students according to their SATs scores. However, as soon as the CAT4 data was available, they compared the KS2 SATs data with the Year 7 CAT4 results. The data from the other assessments was then added into the mix later on in the term.
A lot of St Bede’s teaching is done in mixed-ability classes through its house system, and the senior team decided to provide each house with dedicated CAT4 reports. This enabled house leaders to identify larger-scale trends, as well as useful insights into individual students.
“The house reports were very interesting,” explained Ryan. “The CAT4 data showed us that while the students in one house had an average standardised score of 106, another’s average was 97. From this, it was obvious that some of our houses were far more able than the others. I’m pretty certain that if we hadn’t had that data, we wouldn’t have been able to tailor our teaching as effectively.
“Even within houses, though, there were marked differences in ability across different subjects,” he added. “One house had an average verbal score of 97 and an average quantitative score of 105. Those particular learners were mathematically very astute and this needed to be balanced with additional support in English.”
The other datasets proved useful, too. “One of the most insightful things was the NGRT results. We had 10 students with reading ages of above 17 years as well as five students with reading ages of less than 6.5 years. I don’t think we would have necessarily realised the scale of the difference without this information.”
St Bede’s was also keen to see what PASS showed them; the survey covers nine areas proven to be linked to key educational goals, such as attitudes to attendance and confidence in learning. It highlights any concerns via a traffic light system.
“Our students won’t be able to achieve their full potential if there are barriers in their way, and PASS was very useful in highlighting areas of difficulty.
“The results showed a sea of green for the majority of our students, but there were 15 students with five or more amber or red alerts, whom we deemed to have a negative attitude. This information was shared with the house leaders and pastoral leaders, and while some students were known, others weren’t. It was very, very useful.”
Sharing data with staff
To make sure all staff would get the most out of the assessments, the senior management team ran dedicated INSET sessions. This ensured everyone knew what had been assessed, what hadn’t, and how the data could help them in their teaching.
“We went through CAT in detail,” said Ryan. “It’s important for each department to understand the assessment battery that’s most important for them – the quantitative battery for the Maths department, for example. We also took the time to explain what the core assessment terminology means – not everyone would have been aware of standard age scores, stanines or percentile ranks, so it was important to set aside time to explain it.”
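For readers unfamiliar with that terminology, the relationship between the three measures can be sketched in a few lines of Python. This is an illustration only, assuming the conventional norming of standard age scores to a mean of 100 and a standard deviation of 15, and the standard percentile bands for stanines – the test publisher's manual is the authority on the exact norms used.

```python
from statistics import NormalDist
from bisect import bisect_right

# Standard age scores are conventionally normed to mean 100, SD 15
# (an assumption here; consult the publisher's manual for exact norms).
SAS_DIST = NormalDist(mu=100, sigma=15)

# Cumulative percentage cut-offs for stanines 1-8; stanine 9 sits above 96%.
STANINE_CUTOFFS = [4, 11, 23, 40, 60, 77, 89, 96]

def percentile_rank(sas: float) -> float:
    """Percentage of the norm group scoring below this standard age score."""
    return SAS_DIST.cdf(sas) * 100

def stanine(sas: float) -> int:
    """Map a standard age score to a 1-9 stanine band via its percentile."""
    return bisect_right(STANINE_CUTOFFS, percentile_rank(sas)) + 1

# The two house averages quoted above, plus the norm-group mean:
for score in (97, 100, 106):
    print(score, round(percentile_rank(score)), stanine(score))
```

Under these assumptions, the house averaging 106 sits around the 66th percentile (stanine 6), while the house averaging 97 sits around the 42nd percentile (stanine 5) – which makes concrete why a nine-point gap in average scores mattered to the school.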
Every member of staff was then given their own report, based on the groups they would be teaching. “You’ve got to give teachers the data that’s relevant to them and their class or department. We didn’t want to burden them with more than they needed,” said Ryan.
Once they had received their training, staff went into their own departments, where their first priority was to use the data to re-set their year groups – upwards and downwards.
“I want our students in the right place so they’re challenged correctly,” explained Ryan. “If we set our targets just from KS2 SATs and these weren’t reliable by themselves, our students would go through their entire time at St Bede’s failing and underachieving.
“We encouraged our staff to use a mixture of SATs, the standardised assessment data and their own judgements. As they didn’t have to set or mark the assessments, they were able to spend their time much more effectively, analysing the data and using it to support their lesson plans.”
Sharing data with parents
St Bede’s has always shared school data with parents and decided to share these new datasets, too.
“One of the most useful things from the CAT4 reports are the retrospective KS2 indicators. If any child was more than a whole level below their SATs score, we contacted the parents to discuss the discrepancy face-to-face. If there were no issues, we sent them the dedicated CAT4 report for parents, which explained their results in a very practical way.
“It gave us a great opportunity to have a conversation with parents and discuss any concerns we had. Parents were then more understanding if we subsequently decided to make any set changes,” Ryan explained.
“Parents liked the fact that we knew so much at such an early stage. A lot of parents said things like, ‘We know an awful lot more now than we did before’. They appreciated seeing which subjects their child was likely to excel at and which they might struggle with. They found it all very impressive. It’s nice to start parent-teacher relationships like that.”