GCSE and A Level Grades in 2020

  • 29th April 2020
  • Joe Miller
  • Data analysis and insights, Mime Think Pieces, Post 16 Profile, Super School Profile

In this blog we’ll outline how GCSE and A Level grading will work this year. We’ll discuss some potential pitfalls as well as benefits of this approach, and suggest what data analysis schools, colleges, local authorities and academy trusts may find valuable. This year, despite the lack of formal accountability measures, having access to understandable and accurate data will be as important as ever.

Read a summary of our 2020 secondary and post 16 school analysis services here.

How will grades be calculated?

Simply put, grades will be calculated using judgements by teachers rather than exams (full Ofqual guidance here). Schools and colleges are being asked to judge, based on all available evidence, what grade each pupil would have achieved in each subject if they had been able to sit exams and submit all coursework.

Teachers are then asked to work with colleagues to rank all pupils within each grade in each subject. For example, if 15 pupils in one school are graded to get a 5 in History, those 15 should be ranked from highest to lowest. These assessments will then be submitted to exam boards, with a deadline no earlier than May 29th.

Exam boards, using a model currently being consulted on by Ofqual, will moderate grades to ensure consistency across schools, and fairness compared to previous years. Although this model is yet to be defined, it is likely to take into account expected national outcomes, pupil prior attainment, and historic performance of the school/college (although Ofqual’s own analysis shows that a centre’s prior year performance can be a poor predictor of results).

While the numbers at each grade per subject at a school or college may be moderated up or down, the rank order of pupils will not be changed. The rankings of pupils within each subject made by schools and colleges will be final: neither exam boards nor Ofqual will adjust them.

GCSE and A Level results days will then go ahead as planned on Thursday 20 August and Thursday 13 August respectively.

Benefits and drawbacks of teacher assessments

Just as exams can be an imperfect judge of a pupil, so can these assessments. However, although this system has been brought in because pupils cannot sit exams, there are reasons to think that teacher assessments may be a more reliable judge of a pupil’s performance than a one-off exam. As laid out in Ofqual’s guidance, these assessments will be based on the full range of ‘available evidence’, including bookwork, classwork and mock exams. In fact, Ofqual’s literature review found some evidence that teachers’ estimates:

‘have potentially greater validity than formal tests’.

Predicting a pupil’s grade from such a range of evidence gathered over a long period of time may, for many, seem fairer than the results of a highly pressured one-off exam.

On the other hand, Ofqual’s review stresses that:

‘there is also a range of evidence that highlights issues of low reliability and potential bias in teacher assessments (reviewed in Harlen, 2005) relating to a range of student characteristics, including gender and special educational needs as well as ethnicity and age’

Concerns have been raised about the unconscious human bias involved in making teacher assessments. There may be some evidence that, at a cohort level, some groups of pupils could be assessed more harshly, or indeed more favourably. Specifically, there is concern that unconscious biases and stereotypes relating to gender, ethnicity, disadvantage and SEN status may influence assessments. Others have worried about how assessments will vary between state-funded and independent schools.

We believe that such concerns should act as an impetus for schools, colleges, local authorities, and academy trusts to use data analysis to ensure teacher assessments are reliable and robust. We know that using data to support decision making can help to remove such biases.

The role of data in making assessments

It is crucial that these teacher assessments are as fair and accurate as possible. As discussed, moderation will not change the rankings of pupils determined by the teacher assessments. Therefore, while moderation will try to ensure fairness at a school/college level and between regions/LAs, it will do nothing to ensure fairness and reliability of each school’s or college’s internal ranking of pupils. Getting this right is clearly important to the pupils themselves and is also vital for schools and colleges to ensure accurate benchmarking and reliable evidence with which to make school/college improvement decisions.

Schools, colleges, local authorities and academy trusts should use data and modelling to:

  1. Understand how their teacher assessments compare to prior year results, including their implications for headline measures such as Attainment 8 when aggregated to the whole-school level
  2. Allow scrutiny of assessments that are very different to the grades expected based on prior attainment and the school’s historic performance in each subject
  3. Analyse teacher assessments by pupil group (e.g. disadvantaged status, ethnicity and gender) to explore potential unconscious biases, as illustrated in the sketch after this list
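
To illustrate the third point, the minimal sketch below compares average teacher-assessed grades with average expected grades across pupil groups. The data, column names and grouping variables are illustrative assumptions rather than outputs of our tools.

```python
import pandas as pd

# Hypothetical pupil-level data: one row per pupil per subject.
# All names and values here are illustrative assumptions.
df = pd.DataFrame({
    "pupil_id":       [1, 2, 3, 4, 5, 6],
    "subject":        ["History"] * 6,
    "gender":         ["F", "M", "F", "M", "F", "M"],
    "disadvantaged":  [True, False, True, False, False, True],
    "assessed_grade": [5, 6, 4, 7, 6, 3],              # teacher-assessed GCSE grade (9-1)
    "expected_grade": [5.2, 5.8, 4.6, 6.9, 5.7, 3.8],  # e.g. modelled from prior attainment
})

# Gap between the teacher assessment and the expected grade.
df["gap"] = df["assessed_grade"] - df["expected_grade"]

# Average gap by pupil group: a gap that is consistently more negative for one
# group than for others may warrant a closer look for unconscious bias.
for group_col in ["gender", "disadvantaged"]:
    summary = (
        df.groupby(group_col)["gap"]
          .agg(["mean", "count"])
          .rename(columns={"mean": "mean_gap", "count": "pupils"})
    )
    print(f"\nAssessed-minus-expected gap by {group_col}:")
    print(summary)
```

In practice the same breakdown would be run within each subject, and alongside prior-year results, before drawing any conclusions about potential bias.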

How can data help you produce, scrutinise and understand your assessments?

As ever, we’ll be working with our clients to provide them with detailed analysis of Key Stage 4 and Key Stage 5 attainment and progress. We have developed teacher assessment moderation and analysis tools for GCSEs and A Levels to help schools make their assessments, and then to analyse the effect of those assessments on aggregate attainment and progress scores. These tools facilitate moderation by flagging how assessments differ from the following (a simplified sketch of this flagging logic follows the list):

  • the grades expected based on the pupil’s prior attainment
  • the grades expected when allowing for the school’s historical progress scores
  • the grades expected based on pupils’ scores in recent internal testing
  • the school’s or college’s prior year performance.
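
As a rough illustration of this flagging, the sketch below compares each teacher-assessed grade against several expected grades and flags any assessment that differs by more than a chosen threshold. The column names, data and threshold are illustrative assumptions and are not taken from our actual tools.

```python
import pandas as pd

# Hypothetical teacher assessments alongside expected grades from different sources.
# Column names, data and the flagging threshold are illustrative assumptions only.
assessments = pd.DataFrame({
    "pupil_id":          [101, 102, 103],
    "subject":           ["Maths", "Maths", "English"],
    "assessed_grade":    [7, 4, 6],
    "expected_prior":    [6.5, 5.8, 6.1],  # from the pupil's prior attainment
    "expected_progress": [6.8, 5.5, 6.3],  # adjusted for the school's historical progress
    "expected_internal": [7.0, 4.2, 5.9],  # from recent internal testing
})

THRESHOLD = 1.0  # flag assessments more than one grade away from an expectation

def flag_differences(row):
    """Return the expectation sources this assessment differs from by more than THRESHOLD."""
    sources = ["expected_prior", "expected_progress", "expected_internal"]
    return [s for s in sources if abs(row["assessed_grade"] - row[s]) > THRESHOLD]

assessments["flags"] = assessments.apply(flag_differences, axis=1)
print(assessments[["pupil_id", "subject", "assessed_grade", "flags"]])
```

A moderation discussion could then focus on the flagged assessments rather than re-reviewing every grade.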

Our moderation analysis report then aggregates assessments to provide in-depth pupil group and subject analysis. It allows schools and colleges to explore how scores for different pupil groups, or for each subject, differ from the prior year and/or from the scores expected based on pupils’ prior attainment.

Alongside this, we will provide our on-the-day results service to help schools, colleges, local authorities and academy trusts understand and benchmark their results as soon as final grades are issued. This will add valuable context by showing how awarding body moderation has affected their grades compared to other schools.

Further down the line, as data is shared by schools and the DfE, we’ll be exploring how best to analyse and use this new data alongside previous and future years’ data. We’ll work to identify trends and differences from previous years that might be explained by this year’s system of grading. This will help our clients understand how best to interpret this year’s data when making decisions now, and when looking back at trends.

Summary

The changes to the GCSE and A Level grading system this year will, in some ways, increase the need for schools and colleges to have reliable and accurate data. We will be working with schools, colleges, local authorities and academy trusts to provide them with this support. We have already developed tools to help schools and colleges moderate assessments and we will provide analysis to help them understand what these assessments mean at a cohort and subject level.

At the local authority and national level, Key Stage 4 and Key Stage 5 attainment and progress data published by the DfE this year will need to be well understood in the context of the grading system. Analysis of this data will therefore be hugely important.

We look forward to working together with our clients and partners to support accurate and fair teacher assessments, and enable robust decision making based on a clear understanding of this year’s attainment and progress data.

Click here for a full summary of the secondary and post 16 data analysis services we will be providing this summer.

If you are interested in discussing any of this analysis with us or have any other ideas about how we can support you then please do get in touch.


About The Author

Joe joined Mime in 2019 from the British Medical Association, where he led on their analysis of NHS pressures data. Previously he worked in the Wellcome Trust’s Education team, helping to improve science education. He studied Economics at Cambridge and has a master’s in Public Policy. Volunteering as a tutor for two London-based education charities sparked his passion for using data to improve education outcomes.
