Data Collaboration and the magic Subject Progress Index
December 7, 2018
By: Chris Hildrew, Headteacher, Churchill Academy & Sixth Form
Results days have changed dramatically in the last few years. I remember the days when the results would come in on that Wednesday in August and, with a few buttons pressed, you could see not only how well the students had done but where you stood as a school. How many of them had got five grades at C or above? How many had got five at C or above including English and Maths? And, if you cared about such things, how many of them had achieved the English Baccalaureate?
I’m glad to see the back of the 5A*CEM measure, with all its perverse incentives and focus on arbitrary borderlines. For all its many, many, many flaws, Progress 8’s heart is in the right place – the progress of every individual young person from their own individual starting point “counts” towards the school’s performance. But, because the Progress 8 score is benchmarked against every other young person in the country, the school’s performance can’t be pinned down on results day itself; it arrives about a month later. It’s not until the Tables Checking exercise opens that you can see whether the results you thought were good actually were.
There are often big surprises in the new estimates – big swings in one bucket or another, unexpected variations – which mean you can’t rely on the summer 2017 estimates when assessing student progress towards the 2018 examinations. And, because Progress 8 is organised into buckets, you can’t disentangle an individual subject’s contribution to the overall score with any real clarity.
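For readers less familiar with the mechanics, here is a deliberately simplified sketch of how a pupil’s Progress 8 score is put together. It glosses over the slot-filling rules, discounting and the fine-grained KS2 prior attainment bands, and the grades and estimate are invented for illustration – but it shows why a single subject’s contribution is hard to see once everything is rolled into one basket.

```python
# Simplified sketch of the Progress 8 calculation (illustrative numbers only).
# English and Maths are double-weighted; the whole basket is then compared
# with a national estimate for pupils with the same KS2 starting point.

def attainment_8(english, maths, ebacc_slots, open_slots):
    """Points across the slots, with English and Maths counting double."""
    return 2 * english + 2 * maths + sum(ebacc_slots) + sum(open_slots)

def progress_8(actual_a8, national_estimate_a8):
    """Pupil-level Progress 8: difference from the estimate, spread over 10 slots."""
    return (actual_a8 - national_estimate_a8) / 10

a8 = attainment_8(english=6, maths=5, ebacc_slots=[6, 5, 5], open_slots=[6, 5, 4])
p8 = progress_8(a8, national_estimate_a8=52.0)  # invented estimate for this KS2 band
print(a8, round(p8, 2))  # 53 0.1 -- but which subject earned that +0.1?
```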
As a school leader, I wanted a simple answer to a simple question: is this student making sufficient progress in this subject? Thankfully, this year, SISRA have a solution. It’s called the Subject Progress Index (SPI).
The SPI relies on the data collaboration which sprang up out of the results day blackout. Because so many schools use SISRA Analytics, it was possible to “pool” students’ data anonymously (with schools’ consent, of course) to create an approximation of the 2018 Attainment 8 estimates. It wasn’t perfect – SISRA schools are not the same as “all schools” nationally, and only those schools who consented were included in the collaboration. But the more schools opted in to the collaboration, the more robust the estimates became, and they ended up being a fairly reliable approximation of the actual DfE national estimates.
And then, the clever bit. Within the collaboration data, it was now possible to disentangle the outcomes achieved from each KS2 starting point in each KS4 subject. Clearly some subjects – English, Maths and so on – were taken by more students than others, which made their datasets more statistically reliable. But the Subject Progress Index gives school leaders a fair approximation of whether a student’s performance in an individual subject is similar to, better than, or not as good as that of students nationally who have taken that subject from a similar starting point. This can be broken down by class, subgroup and individual student, so that it is possible to compare whether one group has made better progress than another.
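Conceptually, the calculation is straightforward, even if SISRA’s actual methodology will differ in detail. The sketch below is illustrative only: the column names, 9–1 grading, KS2 banding and numbers are all assumptions, not SISRA’s implementation.

```python
# Illustrative sketch of a subject-level progress index built from pooled results.
import pandas as pd

# Pooled, anonymised collaboration data: one row per student per subject entry.
pooled = pd.DataFrame({
    "subject":  ["Maths", "Maths", "Maths", "History", "History", "History"],
    "ks2_band": ["4.5-4.7", "4.5-4.7", "5.0-5.2", "4.5-4.7", "4.5-4.7", "5.0-5.2"],
    "grade":    [5, 4, 7, 4, 5, 6],  # 9-1 GCSE grades
})

# Benchmark: the average grade achieved across the collaboration in each
# subject from each KS2 starting point.
benchmarks = (
    pooled.groupby(["subject", "ks2_band"])["grade"]
          .mean()
          .rename("expected_grade")
)

# Our own students' results in one subject.
school = pd.DataFrame({
    "student":  ["A", "B", "C"],
    "subject":  ["Maths", "Maths", "Maths"],
    "ks2_band": ["4.5-4.7", "5.0-5.2", "4.5-4.7"],
    "grade":    [6, 6, 4],
})

# Subject-level progress: actual grade minus the expected grade for students
# with the same starting point in the same subject.
school = school.join(benchmarks, on=["subject", "ks2_band"])
school["spi"] = school["grade"] - school["expected_grade"]

print(school[["student", "subject", "spi"]])
print("Class average SPI:", round(school["spi"].mean(), 2))
```

Averaging the per-student differences over a class or subgroup gives the kind of comparison between groups described above.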
The SPI can be applied to internal data collections too. That means that, for this year’s Year 11 mock exams, we can get a sense of whether students are performing in line with, above, or below their peers nationally with similar starting points in the same subjects. Admittedly, this is based on the 2018 exams and only uses SISRA Data Collaboration results, but I am comfortable that it gives us a close-enough approximation of progress to let us target interventions and differentiation where they are needed. And not just at the arbitrary strong or standard pass borderlines, but for every student at every attainment level who could be making more progress, or doing that little bit better.
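Continuing the illustrative sketch above (and reusing its hypothetical benchmarks table built from the pooled 2018 results), scoring a mock data collection is just a matter of comparing each mock grade with the same expected grade for that starting point:

```python
# Continues the sketch above: reuses pandas (pd) and the hypothetical
# `benchmarks` table derived from the pooled 2018 results.
mocks = pd.DataFrame({
    "student":  ["D", "E", "F"],
    "subject":  ["History", "History", "Maths"],
    "ks2_band": ["4.5-4.7", "5.0-5.2", "4.5-4.7"],
    "grade":    [3, 6, 5],
})

mocks = mocks.join(benchmarks, on=["subject", "ks2_band"])
mocks["spi"] = mocks["grade"] - mocks["expected_grade"]

# Flag anyone below the expected grade for their starting point as a candidate
# for intervention -- at any attainment level, not just a pass borderline.
print(mocks.loc[mocks["spi"] < 0, ["student", "subject", "spi"]])
```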
Progress 8 was designed to compare schools, to rank them, to show which schools were better than others, and which were worse. What makes the SPI special, I think, is that this measure, designed to promote competition between schools, has instead spawned a spirit of collaboration and cooperation which works in the best interests of students and their teachers and leaders.