
News emerged last week that T-level examinations turned out to be ‘unreliable’ this summer. One assessment organisation, the Northern Council for Further Education (NCFE), has been plagued by gremlins to such an extent that there have been rumblings and complaints ever since students received their results. The episode has serious implications for all examinations and how assessments are reached. The inevitable cracks in the T-level qualification are now emerging, and the mounting problems are eating into the confidence of students and employers alike.
A review of T-levels by the regulator Ofsted was released on Monday 24th October as ‘A review of the quality of T-level courses: interim report’. It pulls few punches and catalogues the failings that have emerged. It appears that “not all learners felt prepared for how much work they had to do for a T-level course”. There were problems with recruiting and retaining staff, and “many did not receive comprehensive training and found teaching the new curriculum challenging”. The list continued, citing problems with finding placements and delays in placements with employers. For many, a key component of the courses was sometimes missing, and “some providers did not have access to resources such as textbooks and practice exam papers”.
These are serious and fundamental failings that were spotted early on and were bound to emerge in the examination results.
The gremlins strike early on.
The title paraphrases that of a British TV series based upon short stories by Roald Dahl that ran from 1979 to 1988. As expected, ‘Tales of the Unexpected’ always had a twist at the end. This popular genre of TV drama was already well established by ‘The Twilight Zone’, which placed people in unexpected situations and aired from 1959 to 1964. Both are apt metaphors for the current mess in which the organisers of T-levels find themselves. The 1963 Twilight Zone episode ‘Nightmare at 20,000 Feet’ is a good example of most people failing to see the danger while those who do are labelled insane. A passenger, played by William Shatner, is the only one who sees a gremlin on the wing of the plane trying to destroy the engines in a storm at night. Although sane, he is carried off on landing as a madman. But we also see the ground crew puzzling over how the engines came to be so badly damaged.
Likewise, some observers foresaw the likelihood of ‘gremlins’ when T-levels were brought onto the scene with the aftershocks of the pandemic lockdowns still rumbling. The aim of T-levels was to provide an exclusive alternative to A-levels in post-sixteen education, intended to replace the popular BTecs that can be taken in combination with A-levels.
Off the rails after one year.
T-levels are being gradually rolled out. The first three T-levels started in September 2020, with a further seven introduced in September 2021, including ones in health, healthcare science, and science. It is these that appear to have attracted the gremlins after their first-year examinations. They are run by NCFE, which won the contract as the awarding organisation. It cannot be overstressed that these subjects are ‘life-critical’ for many areas of our health provision, and rigorous and reliable assessments of students are essential.
Questions asked.
T-levels were always going to attract interest in Parliament, and questions by Labour MP Toby Perkins are prime examples. The response to one last Monday by the Skills Minister, Andrea Jenkyns, was astounding.
When asked, “What proportion of the Year 1 (a) Health, (b) Healthcare Science and (c) Science T Level exam papers were retrospectively removed from the paper before those papers were regraded further to Ofqual’s announcement on 8 September 2022”, her response was,
“Ofqual conducted a thorough review of the full range of questions in the Core examination papers for Health, Healthcare Science and Science T Levels and concluded that the assessments did not secure a sufficiently valid or reliable measure of student performance.
The department wrote to providers who started teaching T Levels in Health, Healthcare Science and Science in September 2021. The letter confirmed that in light of Ofqual’s findings, students’ grades for the core could be revised to be based entirely on their employer set project grade. To ensure that students were not unfairly penalised by this decision, any students who secured a higher grade in their overall core component than their employer set project were able to carry forward the higher mark”.
The NCFE acknowledged the problem and the solution with their own statement,
“For those students who sat core examinations this summer, the core grades taken forward into the T Level certificate can be the higher of the Employer Set Project grade or the overall core grade already received. Students will also be able to resit both or either the ESP and core assessments in the autumn to improve their grades. No student will be disadvantaged by this, and the highest core component grade will stand”.
This is an astounding admission that severely dents the confidence of teachers and students alike. Moreover, it should set alarm bells ringing for those employing students down the line.
Reliability at stake.
The issue of the reliability of examinations per se has attracted much recent attention, and the problem will not easily go away. It is established for both GCSE and A-level examinations and casts considerable doubt on the idea of relying on a single examination at the end of a course. Dennis Sherwood, author of the recent book ‘Missing the Mark’ (Canbury Press, 2022), concluded that around one in four grades is unlikely to be correct and could be out by plus or minus one grade. This is something most students do not know, and there is a misplaced and tacit acceptance that all is fair and reliable.
TEFS reviewed the situation in August with, ‘Exam results: Missing the mark and shifting the target’. The revelation should come as no great surprise since the regulator in England, Ofqual, readily admits this to be the case. It brings the whole process into the light and cannot be ignored.
Now T-level examinations have turned out to be unreliable, with an acceptance that they seriously underestimated the potential of students. Unlike with their distant cousins, A-levels, the examiners had the option of resorting to other assessments. Although not entirely clear, it seems students were unhappy about exam questions that did not appear to relate to the curriculum taught. They were ill-prepared. This is the one thing that will get a fierce response from students at whatever level they are.
How many students and what was involved?
Over one thousand (1,241) students enrolled on the first T Levels in September 2020, with 1,029 achieving a T-level qualification this summer. The remainder appear to have dropped out. Many did their work placements remotely. There is no doubt the pandemic lockdowns had an adverse effect.
The 2021 cohort have completed one year and have a year to go before achieving their results. But surely they are now puzzled about the moves this summer. Full details have not been released, but FE Week reported that around 1,600 students across 76 colleges and schools on the two-year courses received their first-year results in August.
Basically, too many of them received grades well below what they, or their teachers, might have expected. They were predicted grades from A* to C but ended up with D, E and U grades.
Furthermore, mock examinations failed to prepare them for the outcome. The reasons for this seem opaque at the moment but are being investigated.
The story was first reported by FE Week in August with ‘T level: Student outcry over first-year health and science results’. Many were disappointed.
A letter was issued by the Department for Education (DfE) to the college principals, who had raised concerns in unison. It seems the DfE relented in the light of serious complaints and demands for a rethink from both the colleges and students.
By early September FE Week reported that Ofqual had confirmed the ‘Health and science T Level results will be regraded after watchdog finds ‘serious’ issues’.
“Ofqual have now completed their thorough review of the core assessment papers. This identified issues including question errors, inadequate mark schemes, and questions covering areas not explicitly in the specification”.
Then came the news on 12th October that Ofqual had wider concerns: ‘T Level exams issues more widespread than first thought, Ofqual reveals’. It seems more papers were under scrutiny, and Ofqual released an enforceable ‘undertaking’. Ofqual had raised concerns back in June, but these did not stop the gremlins from taking over.
The structure of T-levels under question.
The idea of T-levels looks like a good option for students not aiming for university. It provides some employment experience and a technical skills route. They are designed with industrial sponsors in mind, and there has been considerable involvement of employers through the T-level panels. The issue is that they are destined to replace overlapping BTec subjects and become an all-or-nothing qualification. In contrast, BTecs can be taken alongside A-levels and offer more choice in leading to university entry. T-levels are less likely to offer the chance of a university place on most degree courses. However, they do carry UCAS points, with the highest achievers getting 168 points for a distinction* (an A* on the core and a distinction in the occupational specialism), equivalent to three A* grades at A-level. Indeed, many universities are listed as accepting T-levels, but only for a limited number of subjects. The government today updated its list of universities accepting a T-level as “suitable for entry onto a minimum of one course”. Of course, for most courses they are not suitable.
A social engineering experiment.
The result is a restructuring of the post-16 education system by ‘social engineering’ towards technical education and the workplace at eighteen. The initial plan was to limit entry into the T-level track to those achieving at least grade 4 in GCSE English and Mathematics. This would be highly likely to divert students capable of completing A-levels onto the technical track. TEFS discussed this in greater detail before the results emerged this summer in ‘With exam results looming, the government is promoting T-levels as ‘Social Engineering’’ (16th August 2022).
However, the Mathematics and English requirements have been downgraded: instead, students must continue with these subjects and reach a level of competence. In guidance updated in June 2022, ‘Introduction of T Levels’, the DfE states that,
“Students will also be required to work towards the attainment of maths and English if they have not already achieved grade 4 at GCSE, as they do on other 16 to 19 programmes. However, T Level students are no longer required to achieve either a grade 4 in English and maths GCSE or level 2 in functional skills to pass their programme”.
This acknowledges that, for T-levels to succeed, there must be wider scope for students to enter the programme.
Eliminating error.
Introducing error into the examination process should be largely avoidable. Paula Goddard, Fellow of The Chartered Institute of Educational Assessors and Senior Examiner, has proposed a system in which no error sampling would be required by “changing the mindset of examining from quality control (dealing with errors after they have happened) to quality assurance (to getting it right first time)” (‘Quality Assurance in Examining’, January 2018). This would entail the simple logical steps of “having hired the correct people to write and design the exam questions in the first place, having the correct admin procedures and expertly written exam specifications”. The aim would be to have “specifications that are so clear and unambiguous that clear exam questions can be written”. It appears the T-levels have fallen well short of this ideal.
Gathering a range of assessment data on students would tend to improve the reliability of the final result (see TEFS, 22nd October 2022, ‘Attainment gaps and questioning the purpose of examinations’). But systemic problems with any element, such as the examination itself, cannot easily be corrected. This is particularly the case if the exam questions are unrelated to the curriculum, which is frankly a ‘rookie error’. Add to this the inherent unreliability of examinations and it is a mess. The DfE and Ofqual have learned their lessons the hard way.
The author, Mike Larkin, retired from Queen’s University Belfast after 37 years teaching Microbiology, Biochemistry and Genetics.