Leaving Cert 2020 calculated grades: What lessons should be learned?

Newly revealed documents throw fresh light on effect of decision to omit school profiling

The memo was hastily written, but the sentiment was clear: there was a danger some students would be “thrown under the bus” by late changes to the calculated grades process last year, according to a senior Department of Education official.

So, what were the changes to the system? Who was affected? And why does this matter now?

To explore these questions, we need to go back to last year's announcement that the Leaving Cert exams were being cancelled due to public health concerns.

How were calculated grades produced last year?
When the Government announced, on May 8th, the cancellation of last year's Leaving Cert exams, it unveiled a new process for estimating students' results: calculated grades.

A similar system is to be used for the 60,000-plus Leaving Cert students this year, although they are being called “accredited grades”.

At the time, the Government said four key sources of data would be used to process students’ estimated marks through a standardisation process:

– Estimated marks and rankings of students supplied by teachers;

– Junior cycle exam performance for students’ classes (to be used as a predictor for the collective performance of a class in 2020);

– Historical national distribution of students’ results on a subject-by-subject basis (this is used every year to ensure results are comparable);

– Historical school data based on past Leaving Cert exam performance in schools across three prior years.

From early on, however, the decision to include this final data set became a source of criticism.

The rationale for using school historical data was that the performance of students in a school does not vary widely from year to year. By mid-August last year, however, UK education systems were in turmoil: calculated grades there had been abandoned after students' results were disproportionately downgraded, especially in disadvantaged schools.

Pressure was piling on the Irish Government to ensure students in disadvantaged – or Deis – schools would not be unfairly penalised.

The Government argued it had numerous safeguards built into its standardisation process to ensure no students would be disadvantaged.

However, new records show that even as it made these assurances, it was drawing up a revised version of the process in order to omit school historical data.

Why were grades standardised in the first place?
Standardisation is, simply, aimed at ensuring students are treated equitably and that grades are consistent from year to year.

This process takes place in normal Leaving Cert years to ensure that grades fit into a “bell curve” of results.

The challenge facing the Department of Education last year was how to fit teachers’ estimated grades into a fair distribution.
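
To make that challenge concrete, here is a minimal, purely illustrative sketch in Python. It is not the department's actual standardisation model – which also drew on Junior Cycle class performance and, in the original design, school history – but it shows how teacher-estimated marks and rankings might be mapped onto a historical national grade distribution so that the overall share of each grade matches past years. Every grade share and mark in it is invented.

```python
# Illustrative sketch only: NOT the Department of Education's actual model,
# whose full details were not published. Under simplified assumptions, it
# shows how teacher-estimated marks and rankings might be mapped onto a
# historical grade distribution so the share of each grade matches past years.

from typing import Dict, List

# Hypothetical historical share of each higher-level grade in a subject.
HISTORICAL_SHARE: Dict[str, float] = {
    "H1": 0.06, "H2": 0.14, "H3": 0.22, "H4": 0.25,
    "H5": 0.18, "H6": 0.10, "H7": 0.04, "H8": 0.01,
}

def standardise(estimated_marks: List[float]) -> List[str]:
    """Assign grades by rank so that the resulting grade distribution
    roughly follows the historical one. Marks are teacher estimates;
    one grade is returned per student, in the original input order."""
    n = len(estimated_marks)
    # Rank students from highest to lowest estimated mark (the teacher ranking).
    order = sorted(range(n), key=lambda i: estimated_marks[i], reverse=True)

    grades = [""] * n
    cumulative = 0.0
    start = 0
    for grade, share in HISTORICAL_SHARE.items():
        cumulative += share
        # How many students should have reached this grade or better.
        end = round(cumulative * n)
        for idx in order[start:end]:
            grades[idx] = grade
        start = end
    # Anything left over from rounding falls into the lowest band.
    for idx in order[start:]:
        grades[idx] = "H8"
    return grades

if __name__ == "__main__":
    marks = [92, 88, 75, 74, 70, 66, 61, 58, 55, 40]  # invented teacher estimates
    print(list(zip(marks, standardise(marks))))
```

The point of contention last year was which history a mapping like this should be anchored to: national results only, or each school's own track record.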

How were teachers' grades adjusted?
Teachers were very generous and significantly overestimated their students' performance compared with previous years' exam results.

Overall, the proportion of H1s awarded to students was two or even three times higher than in 2019.

“Such a change in standards within one calendar year is simply not credible,” reads one internal memo.

Under various standardisation models, it was estimated that about 40 per cent of grades would need to be pulled down to bring them into line with the results of previous Leaving Cert years.

In the end, however, just 17 per cent of grades were lowered.

This led to grade inflation on a level that had not been seen before and a surge in CAO points last year.

So, why was 'school profiling' omitted?
Records show that in mid-August, officials were satisfied the Irish system had a range of safeguards. They argued that the standardisation process had a built-in facility to accredit the learning of "exceptional students" irrespective of the schools they attended.

However, as the UK controversy grew, calls to address the issue of school profiling gained traction.

What happened when this historical data was 'turned off'?
On August 13th, officials examined two schools to determine the impact on each of turning school profiling on and off: Mount Anville in Dublin – a high-achieving fee-paying school – and a Deis school in Co Kildare.

Using higher-level maths results as an example, they found that, with school profiling turned off, the proportion of H1s awarded at Mount Anville dropped by 70 per cent compared with its historical track record.

By contrast, the omission of school historical data led to a substantial increase in the mean marks and grade profile at the Deis school.

These calculations were based on earlier versions of the standardisation process, and so came with a health warning.

Soon, work began on producing two national models of results: one with school historical data minimised (known as “Model 18a”) and another with historical data omitted (“Model 18g”).

Were there concerns over the dropping of school profiling?
In notes on the changes attached to a document dated August 18th, a senior official noted some of the "negative consequences" of Model 18g.

The official noted that among the consequences were that “school effectiveness [was] absent” and that “[Junior Cert] prior attainment [was] more influential”; while students in some cases would be “outperforming historical plausibility”.

The official also questioned if the move would “undermine the credibility of the entire process – you could never do this again as no one would credibly engage in the process again”.

A final comment was that, by taking out the school effects, “the better-performing disadvantaged schools [would] also [be] thrown under the bus”.

So, what were the results of the two models?
On August 19th, Taoiseach Micheál Martin, Minister for Education Norma Foley and senior officials met to consider the two versions.

Officials said Model 18a – which included school historical data, albeit “limited to the greatest possible degree” – resulted in non-Deis schools receiving proportionally more upgrades.

For example, in non-Deis schools, 72 per cent of teachers’ estimated grades remained the same; 23 per cent increased; and just 5 per cent decreased.

As for Deis schools, 69 per cent stayed the same, 8 per cent increased, while 22 per cent decreased.

Model 18g, on the other hand, removed entirely the use of school-by-school historical data. Under this model, there were proportionately fewer downgrades in Deis schools.

For example, in non-Deis schools, 77 per cent of teachers’ grades stayed the same, 4.5 per cent increased and 18 per cent decreased.

In Deis schools, 76 per cent stayed the same, 7 per cent increased and 16 per cent decreased.

Officials said opting for this model would allow the Government to counter accusations of a UK-style postcode lottery. The meeting ended with Model 18g being chosen.

So, who won and who lost?
When the results emerged, most students were happy with the record results.

However, some top-performing schools in the country – both private and non-fee-paying – felt they lost out, with their grades down on past years in many areas. Schools with a strong track record in individual subjects also felt they were unfairly graded.

However, many Deis schools reported record results and said more students than ever were progressing to higher education.

So, is the story of calculated grades a simple story of winners and losers? Was there “warped bias” against top-performing schools – as one private school principal put it – and a strategy to boost the achievement of underperforming schools?

The Department of Education has said the calculated grades process was “blind” and there was no evidence of bias towards any school type.

Education sources have acknowledged, however, that there were issues: the standardisation process did not necessarily pick up clusters of exceptional performance in individual subjects, because students' Junior Cert results were combined into composite scores, and these composites in turn shaped how grades were awarded. The late decision to withdraw school profiling also seems to have played a role.
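
As a purely hypothetical illustration of that last point, the snippet below shows how averaging several Junior Cert results into a single composite score can flatten out an exceptional performance in one subject. The subjects and marks are invented, and the actual 2020 combination method was not published in this level of detail.

```python
# Purely hypothetical illustration: a composite prior-attainment score can
# mask an exceptional result in a single subject. The subjects and marks
# below are invented; the actual 2020 combination was not published in
# this level of detail.

junior_cert_marks = {"maths": 95, "english": 60, "irish": 55, "science": 62}

# A single composite averages everything together...
composite = sum(junior_cert_marks.values()) / len(junior_cert_marks)
print(f"Composite prior attainment: {composite:.1f}")  # 68.0

# ...so a class or student that is outstanding in one subject (here, maths)
# looks only modestly above average overall, and a subject-level cluster of
# excellence can be flattened out before standardisation ever sees it.
print(f"Maths alone: {junior_cert_marks['maths']}")  # 95
```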

Why is this relevant in 2021?
Data on historical school performance in the Leaving Cert will not be used as part of this year's "accredited" grade process, according to the department.

A spokesman said the outcomes of these grades and the exams would have “regard to the pattern of Leaving Certificate results in 2020 and previously. Further details of the process will be determined, taking account of advice from the State Examinations Commission”.

However, assessments expert Prof Michael O’Leary of DCU Institute of Education says school historical data would be useful in this year’s approach.

Because there will be both written Leaving Cert exams and an accredited grades process this year, he argues, it makes sense to use this data to “align” the two sets of grades.

“If school historical data is not considered, then aligning outcomes from the two systems being proposed for 2021 will be very challenging,” he says.

“And if the accredited grades are not standardised to be comparable to typical Leaving Cert grade distributions, and end up being as high as last year’s calculated grades, then further problems arise: additional college places are needed and CAO applicants from 2019 and before [will be] further disadvantaged.”
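
One way to read “align” here is an equipercentile-style mapping, in which a student at a given percentile under one system is assigned the mark earned at the same percentile under the other. The sketch below illustrates that idea only; it is an assumption about what alignment could involve, not a description of the State Examinations Commission's method, and all the marks are invented.

```python
# Illustrative only: an equipercentile-style alignment of two mark scales.
# This is an assumption about what "aligning" two sets of grades could mean,
# not the State Examinations Commission's actual method. All marks are invented.

def percentile_of(value: float, scores: list) -> float:
    """Fraction of scores at or below the given value."""
    return sum(s <= value for s in scores) / len(scores)

def align(mark: float, source: list, target: list) -> float:
    """Map a mark from the source scale onto the target scale by matching
    percentiles: find the target mark sitting at the same percentile."""
    p = percentile_of(mark, source)
    target_sorted = sorted(target)
    idx = min(int(p * len(target_sorted)), len(target_sorted) - 1)
    return target_sorted[idx]

if __name__ == "__main__":
    accredited_marks = [55, 60, 62, 70, 74, 80, 85, 90]  # hypothetical accredited-grade marks
    exam_marks = [48, 52, 58, 63, 66, 71, 77, 83]        # hypothetical written-exam marks
    # A mark of 80 sits at the 75th percentile of the accredited marks,
    # so it maps to the exam mark at that percentile.
    print(align(80, accredited_marks, exam_marks))  # -> 77
```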