Achievement of Curriculum for Excellence (CfE) Levels 2023-24 - Methodology
Details of methodology for the Scottish Government publication: Achievement of Curriculum for Excellence (CfE) levels: 2023-24.
5 Data collection, quality and timeliness
5.1 Data collection
ACEL data is collected from 32 local authorities, one grant-aided school and seven special schools.
The ACEL Census day falls on the second Monday of June each year, near the end of the school year. Each local authority or school takes a data cut on the Census date and performs its own quality assurance of the data, in discussion with schools as necessary (detailed further in section 5.2 below).
After this period of internal quality assurance, data is submitted to the Scottish Government, using the ProcXed system, by the final Friday of August. In most cases the data comes directly from SEEMiS (an Education Management Information System).
The data specification, which outlines the data that should be provided and the format it should be provided in, can be found on gov.scot: Scottish Exchange of Data: achievement of Curriculum for Excellence levels.
Before the data can be submitted, a number of automated validation checks take place. These are outlined in section 3 of the specification and include, for example:
- checks that each pupil for whom data is being provided is recorded as being in one of the relevant stages for which ACEL data is collected (i.e. P1, P4, P7, S3),
- checks that a judgement has been provided for all of the organisers (English Reading, English Writing, English Listening & Talking, Numeracy, Gaelic Reading, Gaelic Writing, Gaelic Listening & Talking) for which one is expected.
These checks are referred to as ‘first stage validation’.
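The intent of these first stage checks can be sketched as follows. This is an illustrative sketch only: the record layout, field names and the shortened organiser list are assumptions, not the actual ProcXed data specification.

```python
# Illustrative sketch of first stage validation. Record layout, field names
# and the (shortened) organiser list are assumptions, not the actual
# ProcXed data specification.
EXPECTED_STAGES = {"P1", "P4", "P7", "S3"}
ORGANISERS = [
    "English Reading",
    "English Writing",
    "English Listening & Talking",
    "Numeracy",
]

def first_stage_validation(record: dict) -> list[str]:
    """Return a list of validation failures for one pupil record."""
    errors = []
    # Check 1: the pupil must be in a stage for which ACEL data is collected.
    if record.get("stage") not in EXPECTED_STAGES:
        errors.append(f"stage {record.get('stage')!r} is not an ACEL stage")
    # Check 2: a judgement must be present for every expected organiser.
    judgements = record.get("judgements", {})
    for organiser in ORGANISERS:
        if not judgements.get(organiser):
            errors.append(f"missing judgement for {organiser}")
    return errors
```

A record passes this sketch of first stage validation only when the returned list is empty, mirroring the requirement that the automated checks are satisfied before data can be submitted.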
Upon receipt of the data from local authorities, Scottish Government statisticians perform further quality assurance checks throughout September. This is referred to as ‘second stage validation’. The data provided by each authority or school is run through a series of checks in a statistical analysis software package (currently SAS). These checks include:
- ensuring that data has been provided for every school in a local authority (unless there are no pupils in the relevant stages within that school),
- checking that data for each pupil has been provided only once,
- considering data observations that are not in line with usual patterns,
- comparing the number of pupils for whom data has been provided to the number of pupils recorded as being in that school and stage in the Scottish Government’s pupil census collection for that academic year,
- identifying cases where a pupil has been recorded as ‘not assessed’ (often used where a pupil has recently arrived at a school and so a teacher has not had enough time to make a judgement) but where the pupil was recorded as being at the same school at the time of the previous September’s pupil census,
- comparing the proportion of pupils in a given school and stage who achieved the expected level against the figures provided for that same school and stage the previous year (recognising that this will be a different group of pupils and that the figures can legitimately change between years).
This validation identifies clear errors (such as more than one record being provided for the same pupil) as well as other cases which may not necessarily be errors but where Scottish Government statisticians believe local authorities may wish to double-check the submitted data.
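Two of these second stage checks, the duplicate-record check and the comparison against the pupil census, can be sketched as follows. The record layout, key structure and zero tolerance are illustrative assumptions; the actual checks are implemented in SAS.

```python
# Illustrative sketch of two second stage validation checks. Record layout,
# key structure and tolerance are assumptions, not the actual SAS checks.
from collections import Counter

def find_duplicate_pupils(records):
    """Clear error: the same pupil submitted more than once."""
    counts = Counter(r["pupil_id"] for r in records)
    return [pid for pid, n in counts.items() if n > 1]

def compare_with_pupil_census(submitted_counts, census_counts, tolerance=0):
    """Flag school/stage groups where the number of submitted records
    differs from the pupil census roll by more than `tolerance`."""
    flags = []
    for key, expected in census_counts.items():
        got = submitted_counts.get(key, 0)
        if abs(got - expected) > tolerance:
            flags.append((key, got, expected))
    return flags
```

In this sketch a duplicate record is a clear error, while a mismatch against the census roll is only flagged for the local authority to double-check, since legitimate reasons (for example, pupils moving school between the two collections) can explain small differences.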
While in some cases local authorities do not make any changes to their data at this stage, most do. Typically the number of changes made is relatively small compared to the total quantity of data that is provided, representing less than one per cent of records.
Once local authorities and schools have completed their checks and made any necessary amendments to the data they re-submit it. Scottish Government statisticians then prepare summary information for each school in a local authority which is then sent to Directors of Education (or equivalent) for their final ‘sign-off’ of the data, usually by early to mid-October. It is unusual for any changes to be made to the data at this stage.
Once the data has been signed off for all local authorities it is then analysed by statisticians in order to produce this document and the supplementary tables – which are published on the second Tuesday in December.
5.2 Data quality processes undertaken by local authorities
The data which is used in the ACEL publication undergoes rigorous quality assurance processes.
The quality assurance of the data itself is outlined in the ‘Data collection’ section above. However, prior to the data being submitted to the Scottish Government significant work is undertaken by teachers and schools to ensure that the judgements they are making about the level at which pupils are performing are accurate and consistent with expected standards.
The expected standards under CfE are embedded in the Experiences and Outcomes. These are a set of statements about children’s learning and progression in each curriculum area which are used to help plan learning and to assess progress. Further to this Education Scotland have published Curriculum for Excellence Benchmarks for literacy and numeracy to provide clarity on the national standards expected within each curriculum area at each level. They set out clear lines of progression in literacy and English and numeracy and mathematics in order to make clear what learners need to know and be able to do to progress through the levels, and to support consistency in teachers' and other practitioners' professional judgements.
Scottish National Standardised Assessments are also used to help inform teachers’ judgements.
Further to this, a national programme of Quality Assurance and Moderation has been put in place to provide more support and improve confidence and understanding amongst teachers.
Education Scotland continues to support training for current and new Quality Assurance and Moderation Support Officers (QAMSOs). All local authorities have QAMSOs who support teaching colleagues in the moderation of work against CfE levels. More detail on the role of the QAMSO and the support they offer to schools is available in the linked factsheet published by Education Scotland.
The aims of the QAMSO programme are to:
- support local authorities and schools to develop a better understanding of standards in literacy and numeracy,
- support effective assessment and moderation, including the development of high quality, holistic assessments,
- share good practice, and
- enable local authorities and schools to have greater confidence in the validity and reliability of teacher professional judgement.
Teachers undertake regular moderation exercises to ensure the CfE levels provided for children and young people are accurate and the ACEL data is robust. More generally, a wide range of both local and national moderation activity takes place and further information on this is provided below.
In order to better understand how the submitted ACEL data has been determined and any quality assurance processes or issues that take place before the data is submitted, the Scottish Government oversees an annual qualitative collection of information from each local authority. This seeks information on:
- The types of evidence used by teachers to support the judgements being made (see section on Evidence below)
- The nature of moderation exercises undertaken by teachers and schools (Moderation)
- Quality assurance processes used by local authorities to validate and approve their data (Quality Assurance)
- Any data quality issues or concerns (Data quality)
For the 2023-24 data collection, returns were received for all 32 local authorities. A summary of the responses is provided below.
5.2.1 Evidence
Local authorities reported a wide variety of evidence being used by teachers to reach their professional judgements. This included classwork, observation of pupils in class, discussions with children, focus groups, reviewing pupil portfolios and learning logs, written, digital and photographic evidence, formative and summative assessments, and Scottish National Standardised Assessments (SNSA). Four out of five local authorities referred to using SNSAs as part of their evidence. A small proportion of local authorities reported using commercially produced assessment tools as part of their evidence gathering. Several authorities said they referred to the national benchmarks when making judgements.
5.2.2 Moderation
Every local authority reported the availability and/or use of moderation techniques and support. In some cases the evidence provided stated that moderation support was available to all schools, but it was not clear the extent to which this was actually used at a school level. In others the evidence suggested significant moderation activity was taking place, with some referring to mandatory events.
Moderation activity took place at a variety of different levels, often depending on the circumstances of that school (e.g. its size). Types of moderation reported included: internal moderation, peer moderation, whole school moderation, cluster group moderation, LA wide moderation, associated school groups (ASG) moderation and trio moderation. Most LAs referred to multiple types of moderation activity running concurrently with the most commonly mentioned being cluster group moderation.
The moderation activity itself involved collaborative working between teachers within these groups. It included, for example, professional dialogue around planning and the sharing of standards, staff teams working together to draw on guidance and exemplification, sampling of writing across schools in a local authority, and teachers reviewing assessment judgements through professional dialogue. Schools also received training opportunities from central officers. Curriculum Advisory Groups and Subject Network Groups were used to focus on authority-wide moderation activities.
The use of national benchmarking information was cited in one third of returns, and just over a quarter of local authorities (28 per cent) referenced making use of Quality Assurance and Moderation Support Officers (QAMSOs) and Education Scotland Attainment Advisers to support this work.
5.2.3 Quality Assurance
Authorities reported using a range of methods to quality assure the data. These included:
- providing standardised data to schools and discussing this with them;
- use of progress and achievement tracking information in SEEMiS to track progress of pupils throughout the school year;
- using summary tables available via ProcXed to check summary data against that recorded locally;
- regular meetings and discussions between central education teams and headteachers/leadership teams;
- involvement of quality improvement officers in around half of local authorities (53 per cent reported this in their returns, though the actual proportion may have been higher);
- local progress and achievement data collections and analysis throughout the session and comparison between this and final judgements;
- comparisons between predictions and submitted judgements;
- checks that all eligible pupils have a teacher judgement;
- investigations of cases where pupils were recorded as ‘Not yet assessed’ or as ‘Following individual milestones’;
- quality assurance of a sample of moderated work from establishments across the local authority;
- comparisons against historical data to identify anomalies in trends;
- replication of the quality assurance process subsequently performed by Scottish Government analysts.
5.2.4 Data Quality
The majority (69 per cent) of local authorities stated that they were confident in the teacher judgements that had been submitted. It is important to note that, of the remaining local authorities, most did not indicate concerns about the quality of their data; they simply did not explicitly state that they were confident in it.
There was generally more confidence in the teacher judgements for Primary school stages than for S3, with around one in ten local authorities citing concerns about the comparability and robustness of data at S3 level.
A small number of authorities indicated that while they were generally happy with the quality of the data that had been submitted they continued to have concerns about a small number of schools or about judgements at a particular level. The likely impact on the national data is minor.
Increased training and moderation opportunities were highlighted by some local authorities as a means to improve on the quality of the data. In most cases, these plans were already in place.
5.3 Timeliness
ACEL data relates to the second Monday of June (i.e. close to the end of the academic year) and is published around six months later on the second Tuesday of December (i.e. around four months into the following academic year).
During the first two and a half months of this period schools and local authorities collate the data and perform internal validation – the Scottish Government does not have the data at this point.
The data is submitted to the Scottish Government on the last Friday in August. Scottish Government statisticians then work with local authorities to perform further quality assurance (second stage validation) over the next month and a half (throughout September and the beginning of October).
The Scottish Government has a fully quality assured dataset by early to mid-October and the following two months are used to process and analyse the data and to prepare material for publication on the second Tuesday in December.
5.4 Data limitations
Evidence from data suppliers along with analysis by Scottish Government statisticians indicates that the quality of ACEL data has improved since it was first collected in 2016-17. The availability of national guidance documents, Scottish National Standardised Assessments and a national programme of Quality Assurance and Moderation has supported this.
Teachers have become more familiar with the process and have grown in confidence in making judgements. This fed into the decision to remove the ‘Experimental Statistics’ label from the 2018-19 publication.
Whilst it is acknowledged that judgements can be subjective, a wide range of supporting guidance, moderation activity and quality assurance checks are in place to ensure the teacher judgement data are consistent and reliable.
5.5 Developments in the ACEL data collection and publication since its introduction
This publication, and the associated supplementary tables, provide comparisons back to 2016/17, at a national and local authority level. When making such comparisons, it should be noted that both analysis of the data and evidence provided to us by local authorities suggest that the robustness and consistency of the data have changed during this period.
2015-16 – first year of data collection
Analysis of 2015-16 data, alongside the 2016-17 data, highlighted inconsistencies between the two years. Due to this we do not recommend comparing 2015-16 data with data for subsequent years and therefore 2015-16 data is not presented in this publication or the associated supplementary tables.
2016-17 and 2017-18 – Experimental Statistics
These were new statistics in development, published to involve users and stakeholders in their development and build in quality and understanding at an early stage. The robustness and consistency of these statistics increased over time. This should be kept in mind when making comparisons between years.
2018-19 – Official Statistics
From 2018-19 it was decided that ACEL statistics would no longer be labelled as Experimental Statistics. The factors that led to the removal of the experimental label can be found in a paper available here. The robustness and consistency of these statistics increased over time. This should be kept in mind when making comparisons between years.
2019-20 – Data collection was cancelled
The ACEL collection and publication was cancelled in 2019-20 due to the difficulties in collecting data whilst schools were closed due to COVID-19.
2020-21 – Primary pupils data collection only
The 2020-21 ACEL publication covers Primary school children (P1, P4 and P7) only. Secondary school and special school data was not collected due to other pressures on these schools, including implementation of the SQA National Qualifications Alternative Certification Model, which was used to award National 5s, Highers and Advanced Highers in 2021.
The time period covered by the 2020-21 statistics means that the results will be affected by the coronavirus (COVID-19) pandemic. This should be kept in mind when making comparisons between years.
2021-22 to date – complete data collections
The ACEL publications cover Primary school children (P1, P4 and P7), Secondary 3 pupils and all pupils based in special schools/units.
5.6 Comparing between local authorities
If making comparisons between local authorities we recommend keeping in mind the context of the authorities and their approach to assessment.
In particular, in some local authorities, pupils with complex needs are integrated into mainstream schools; these pupils have been included throughout this publication. In other local authorities, however, pupils with complex needs may attend a special school or standalone special unit. See Section 4.3 for more information.
5.7 School level data
School level results are also being released alongside this publication and are available in the School Information Dashboard. Data will be published for all publicly funded primary and secondary schools, subject to data protection limitations.
All school level results will be presented in ten per cent bandings (i.e. under 10 per cent, 10 per cent – under 20 per cent, … , 90 per cent or more). To prevent potential disclosure of information relating to individual pupils, any results relating to a grouping of 20 pupils or fewer will be suppressed. This means that around 19 per cent of primary schools and five per cent of secondary schools will have no information published for them.
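The banding and suppression rules described above can be sketched as follows. The function name and exact handling of band edges are illustrative assumptions; only the ten per cent bandings and the suppression threshold of 20 pupils come from the publication rules.

```python
# Illustrative sketch of the school level banding and suppression rules.
# The banding boundaries and the 20-pupil suppression threshold come from
# the publication; everything else is an assumption.
def banded_result(n_pupils: int, pct_achieved: float) -> str:
    """Band a school-level percentage into ten per cent bandings,
    suppressing any grouping of 20 pupils or fewer to prevent
    potential disclosure of information relating to individual pupils."""
    if n_pupils <= 20:
        return "suppressed"
    if pct_achieved >= 90:
        return "90 per cent or more"
    lower = int(pct_achieved // 10) * 10
    if lower == 0:
        return "under 10 per cent"
    return f"{lower} per cent - under {lower + 10} per cent"
```

Publishing only a banding, rather than the exact percentage, adds a further layer of protection on top of the suppression of small groups.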
As with the national and local authority level data, school results include ‘Pupil following individual milestones’. This may have a particularly large impact on schools with an integrated special unit.
Children who were recorded as ‘Not Assessed’ are not included in the calculations.
The data quality considerations described in Section 5 also apply to school level data. There is greater likelihood that an individual school’s results are affected by variations in assessment approach, socio-economic context and school size (for example) than is the case at the more aggregated local authority or Scotland level. If making comparisons between schools we recommend keeping in mind the context of the authorities and their approach to assessment.