On May 9, 2025, the Joint Admissions and Matriculation Board (JAMB) released the results of the 2025 Unified Tertiary Matriculation Examination (UTME), unleashing a torrent of dismay across Nigeria.
Over 1.5 million candidates scored below 200 out of 400 marks, a shocking departure from historical performance trends. The widespread outrage prompted Alex Onyia, CEO of Educare, to spearhead an investigation into the results’ integrity.
On May 14, Onyia tweeted, “I promised I will share a detailed technical report of what happened after our review with JAMB core system. Carefully read it please…”.
The accompanying JAMB 2025 UTME Technical Review Report, authored by Engr. James Nnanyelugo of the Educare Technical Team, revealed a critical human error that compromised the results of nearly 388,000 candidates, triggering JAMB’s swift remedial action.

The controversy erupted after candidates and parents questioned the validity of the low scores and attributed them to potential technical failures during the computer-based test (CBT).
The public outcry necessitated an urgent review, leading to a high-level technical session on May 14 at JAMB’s headquarters in Abuja. Presided over by Registrar Professor Ishaq Olanrewaju Oloyede, the meeting aimed to “unravel the root causes behind the unexpectedly poor candidate performance and to establish clear mitigative measures to restore confidence in the integrity of the UTME assessment process,” as stated in the Educare report.
The review panel comprised JAMB directorate heads, lead systems analysts, CBT Centre Regulatory Committee delegates, Educare Technical Team representatives, and engineers from the software vendors managing the CBT infrastructure.
The session meticulously dissected the examination’s technical framework, focusing on the software stack, question delivery mechanisms, randomisation protocols, scoring logic, quality assurance processes, and potential human influences. The report noted, “Discussions commenced with a comprehensive analysis of the existing system architecture,” highlighting the panel’s thorough approach.
A pivotal finding was the introduction of three systemic changes in the 2025 UTME.
First, JAMB transitioned from a count-based to a source-based analysis of results, evaluating the logic and origin of each answer rather than merely tallying correct responses. Second, full-scale shuffling of questions and answer options was implemented to strengthen test security, ensuring that no two candidates in the same session received identical permutations. Third, systemic optimisations reduced lag during the test. The report credited these changes with producing the strongest results in 15 years: “This was a major policy change that saw the best and highest-obtained UTME score in 15 years. And this would have amounted to a great achievement by JAMB!”
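The report does not publish JAMB’s implementation, but the shuffling it describes can be illustrated in outline. The following Python sketch shows one way a per-candidate permutation of questions and option labels might work; every function and field name here is hypothetical, not drawn from JAMB’s actual system:

```python
import random

def shuffle_paper(questions, candidate_id):
    """Illustrative sketch: produce a per-candidate permutation of
    questions and option labels.

    `questions` is a list of dicts: {"id", "options" (dict keyed A-D),
    "answer_key" (the correct option's label)}. Seeding the generator
    with the candidate ID makes each candidate's permutation unique yet
    reproducible, so the scorer can rebuild the same key later.
    """
    rng = random.Random(candidate_id)               # per-candidate deterministic seed
    shuffled = []
    for q in rng.sample(questions, len(questions)): # shuffle question order
        keys = list(q["options"].keys())
        rng.shuffle(keys)                           # shuffle option order
        # Relabel the shuffled contents as A, B, C, D in their new positions
        remapped = {label: q["options"][k] for label, k in zip("ABCD", keys)}
        # The correct option keeps its content but gets a new label
        new_answer = "ABCD"[keys.index(q["answer_key"])]
        shuffled.append({"id": q["id"], "options": remapped, "answer_key": new_answer})
    return shuffled
```

Because the permutation is a deterministic function of the candidate ID, a shuffle-aware scorer can reconstruct each candidate’s answer key without storing a separate key per paper, which is the property the validation logic would depend on.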
However, these advancements were undermined by a critical operational failure. The report revealed, “The system patch necessary to support both shuffling and source-based validation had been fully deployed on the server cluster supporting the KAD (Kaduna) zone, but it was not applied to the LAG (Lagos) cluster, which services centres in Lagos and the South-East.”
This oversight persisted until the 17th session, affecting 157 centres—92 in the South-East and 65 in Lagos—and 379,997 candidates. The unpatched LAG servers ran the outdated validation logic, causing mismatches during answer scoring that skewed results.
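The report describes this failure mode only at a high level. As a minimal sketch of how such a mismatch could arise (all names hypothetical, not JAMB’s actual code): a shuffle-aware scorer compares responses against the candidate’s permuted key, while an unpatched scorer compares them against the unshuffled master key.

```python
def score_patched(responses, candidate_key):
    """Shuffle-aware scoring: validate each response against the
    candidate's own permuted answer key."""
    return sum(1 for qid, choice in responses.items() if choice == candidate_key[qid])

def score_unpatched(responses, master_key):
    """Outdated logic: validate against the unshuffled master key.
    A candidate who chose the correct option on a shuffled paper is
    marked wrong whenever the shuffle moved that option to a new label."""
    return sum(1 for qid, choice in responses.items() if choice == master_key[qid])

master_key = {"q1": "B", "q2": "D"}          # unshuffled key
candidate_key = {"q1": "C", "q2": "D"}       # shuffle moved q1's correct option to C
responses = {"q1": "C", "q2": "D"}           # candidate answered both correctly
print(score_patched(responses, candidate_key))   # full credit
print(score_unpatched(responses, master_key))    # q1 spuriously marked wrong
```

In this toy example the candidate answers both questions correctly, yet the unpatched scorer awards only one mark, which is the shape of the systematic deflation the report attributes to the LAG cluster.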
To substantiate these findings, the Educare Technical Team analysed response data from over 18,000 candidates, yielding 15,000 authentic records after deduplication. The report confirmed:
“Of these, more than 14,000 originated from the regions serviced by the unpatched LAG servers,” aligning with JAMB’s internal audits. This overlap validated the technical review’s conclusions, pinpointing the error’s scope and impact.


The report emphasised that the issue stemmed from human error, not system failure or deliberate manipulation: “This incident was neither a system failure nor administrative manipulation but an outright human error,” the JAMB 2025 UTME Technical Review Report stated.
The panel’s scrutiny of JAMB’s quality assurance frameworks revealed no evidence of manual post-processing or intentional interference, reinforcing the error’s inadvertent nature. The report praised JAMB’s transparency, noting, “JAMB opened its systems to independent reviews to restore public confidence and ensure the reliability of the UTME for all stakeholders.”
In response, Professor Oloyede held a press briefing at 3:00 p.m. on May 14, issuing a public apology and announcing that affected candidates could retake the exam at no cost.


To mitigate scheduling conflicts with the ongoing Senior School Certificate Examination (SSCE), JAMB coordinated with the West African Examinations Council (WAEC). Candidates were directed to reprint their examination slips by May 17 to confirm rescheduled test dates.
The report highlighted JAMB’s communiqué, titled “Man Proposes, God Disposes”, which included an “Appeal, Appreciation, and Apology” section reaffirming its commitment to fairness and transparency.
The technical review exposed vulnerabilities in JAMB’s deployment processes, prompting commitments to implement stronger validation protocols and real-time monitoring. The report concluded, “This review, conducted with thoroughness and transparency, signifies JAMB’s resolve to uphold the sanctity of its examination processes.” The incident underscored the challenges of scaling technology for high-stakes national examinations, particularly where deployment still depends on manual human steps.
The incident also raises broader questions about the robustness of Nigeria’s examination infrastructure. The report’s call for enhanced protocols reflects a path toward resilience, ensuring that future UTME iterations prioritise reliability and fairness.