School Data – More, more, more? Part 1

September 5, 2019
By: Chris Beeden, Educational Data Consultant, School Data Managed Ltd

In this series of blogs, Chris Beeden from School Data Managed Ltd discusses the use of data in schools, from humble beginnings to today’s measures and systems.

Old School

Mark the work with ticks and crosses and the odd comment. At the bottom of the work, write a grade for the standard of the work produced.

MIS History

In the 80s, in the depths of Central Bedfordshire, a small group of teachers decided a computer could help them write school reports. They created a DOS-based program to enter comment banks and, via key presses, you could very quickly create a report. This developed into Student Teacher Academic Records, and an Attendance and Exams package was added. BBBD enabled an exam entry to be made in under 2 seconds. Form 7 could be completed from the database for funding purposes.

Windows came along and SIMS was developed in the new format. When Form 7 changed to the School Census, it gave the supplier a unique chance to corner the market, as the more complicated return needed more development time and was (and still is) a barrier to competition. At this point the MIS was in the office and data entry was done by office staff, with an OMR sheet printed for teachers to do some of the data entry for reports, attendance and assessment. The next step was to join the curriculum and administration networks, and suddenly a school had 100 data entry clerks to complete the data. SIMS brought out Assessment Manager 7, and school teachers became data entry clerks overnight.

Fairly Old School – What we did in the early 2000s

With the new MIS system and Excel improving weekly, the data analysis world was progressing well as we survived the millennium bug. A test led to a score, which was converted to a grade (a current grade or an end-of-course forecast). Some ranked pupils to show their position in the year group. We were able to collect these and used CATS, Yellis, FFT, ALIS, ALPS, 2 LOP, 3 LOP or 4 LOP to benchmark outcomes. Some great analysis of progress was produced and we started to look closely at sub-groups’ progress. Targets were calculated using one of the above and then agreed with the pupils. The systems were developing, but data had its place.
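To make that score-to-grade-to-rank step concrete, here is a minimal Python sketch of the kind of spreadsheet logic involved. The grade thresholds and pupil data are invented for illustration only and are not drawn from CATS, Yellis, FFT or any of the other systems mentioned above.

```python
# Illustrative only: assumed grade thresholds, not taken from any real benchmarking system.
GRADE_THRESHOLDS = [(70, "A"), (60, "B"), (50, "C"), (40, "D"), (0, "E")]

def score_to_grade(score: int) -> str:
    """Convert a raw test score into a (hypothetical) grade band."""
    for threshold, grade in GRADE_THRESHOLDS:
        if score >= threshold:
            return grade
    return "U"

# Hypothetical pupils and raw scores.
pupils = {"Pupil A": 72, "Pupil B": 55, "Pupil C": 63}

# Rank pupils by score to show position in the year group (1 = highest).
ranked = sorted(pupils.items(), key=lambda item: item[1], reverse=True)
for position, (name, score) in enumerate(ranked, start=1):
    print(f"{position}. {name}: score {score}, grade {score_to_grade(score)}")
```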

More, more, more…

Once teaching staff became competent in completing registers and mark sheets, data collection began to grow fairly quickly. “We expect 6 collections a year for an RI school” is something I have heard many times over the last decade (most recently from the DfE in the past 6 months).

Whether it was a DfE advisor, Ofsted, an LA advisor or a MAT CEO, there was a perception that if a school was good at collecting and analysing data then it was improving. Data could also be used as a ‘comfort blanket’: ‘I know the pupils are progressing and here is my folder to prove it – look at this graph’. The World Wide Web was developing rapidly at the same time, and the new anywhere, anytime access to data further led to its growth. The recording of each individual task became possible, leading to hundreds of thousands of data items for one pupil – and this could be done at school, on the bus, at home or at the pub.

Chris Beeden

Chris runs School Data Managed Limited, established three years ago. Previously, he was a school data manager for over a decade after working for Capita SIMS in a local support unit. He is a timetabler, has a working knowledge of assessments from Early Years to Post-16, supports school census completion, and is a qualified GDPR Practitioner and practising DPO.

Chris tweets from @ChrisBeeden

Click here to read part 2 of Chris’ blog