CBSE’s on-screen marking explained: What really happens to your Class 12 answer sheet
For students, the CBSE Class 12 Board exam is the stage of life when school stops feeling routine and starts feeling consequential. The marks they score in this high-stakes exam do not just close a chapter; they decide what opens next: college admissions, course choices, scholarships, eligibility cut-offs and, in many cases, a young aspirant’s first real sense of direction. That is why evaluation in Class 12 is an extremely sensitive exercise. Even a small marking error can alter rank lists, narrow options, or dent confidence. To improve accuracy and consistency, and to monitor the marking process more closely, CBSE has decided to roll out On-Screen Marking (OSM) for Class 12 answer scripts from the 2026 examinations, while Class 10 evaluation will continue in the traditional way for now. The idea is not entirely new: back in 2014, the Board ran a pilot of on-screen marking for Class 10 answer scripts. In an exam of this scale, how answers are evaluated matters as much as how students perform.
What On-Screen Marking actually means in practice
CBSE’s Class 12 Board exams in 2026 will feel completely familiar inside the exam hall. Students will still write with a pen, in the same stitched answer booklet, under the same invigilated setting. Nothing changes at that stage. The real change begins only after the exam is over.

Once a student hands over the answer script, the copy will no longer move through the old paper-heavy route of sealed bundles being sent to evaluation centres for manual checking. Instead, it will enter a more controlled digital system: OSM. In simple terms, OSM means the checking shifts to a computer screen. The answer book is first scanned and uploaded to a secure central platform. Examiners then log in with authorised credentials, open the digitised scripts, and award marks on screen rather than flipping through physical copies.
CBSE Class 12 OSM: How will on-screen marking work?
As already mentioned, nothing really changes for students inside the exam hall. The shift begins after the exam ends, and it unfolds quietly within the system that teachers and schools now have to prepare for. The backbone of this transition lies in how examiners are identified, trained and brought onto the platform. CBSE has asked all affiliated schools to update detailed records of their Class XI and XII teachers on its portal, the Online Affiliated School Information System (OASIS). This data, ranging from subject expertise to contact details, forms the pool from which examiners are mapped and given access to the evaluation system.

Once this database is in place, teachers are onboarded digitally. Login credentials are sent to their registered email IDs, along with OTP-based authentication on their mobile numbers. The process is designed to ensure that only verified examiners can access the platform. On first login, teachers are required to secure their accounts and familiarise themselves with the system before evaluation begins.

One of the most telling aspects of the rollout is the emphasis on preparation. Multiple rounds of mock evaluation have been built into the process, including a large-scale, synchronised “mass mock” where teachers log in at scheduled times and practise evaluating sample answer scripts. The idea is that by the time real answer books appear on screen, examiners are already comfortable navigating the system, entering marks, and working within a digital interface.

When the actual evaluation begins, it moves entirely onto the Digital Evaluation Platform. Once digitised, scripts are allotted to teachers through the platform. The examiner logs in and checks the answer book on a computer screen instead of working from a physical bundle. Teachers are assigned answer books in small lots; once one batch is completed, the next is allotted by the system.
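The lot-wise allotment described above can be pictured with a short sketch. This is purely illustrative, assuming a simple sequential queue; the function name, lot size and script labels are hypothetical, since CBSE has not published how its platform actually schedules batches.

```python
# Illustrative sketch only: CBSE has not published the platform's internals.
# The function name, lot size, and script labels below are all hypothetical.

def allot_in_lots(scripts, lot_size=3):
    """Yield answer scripts in small lots; the next lot is released
    only after the caller has worked through the previous one."""
    for start in range(0, len(scripts), lot_size):
        yield scripts[start:start + lot_size]

pending = [f"script-{i}" for i in range(1, 8)]  # 7 digitised scripts
for lot in allot_in_lots(pending):
    print(lot)  # first lot: ['script-1', 'script-2', 'script-3']
```

The point of the batching is simply that an examiner never sees the whole pile at once; the system meters out work and records each completed batch before issuing the next.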
Marks are entered question-wise, and the software handles the totalling and tabulation automatically. That means the examiner still judges the quality of each answer, but the arithmetic is taken out of manual hands.

CBSE has also made it clear that institutions must ensure technical readiness: functional computer systems, stable internet connectivity, and uninterrupted power supply. In addition, a dedicated dashboard allows principals to monitor whether teachers have logged in, completed mock sessions, and are ready for live evaluation. Simply put, the responsibility for smooth execution is now distributed across both the Board and schools.
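The question-wise entry and automatic totalling described above can be sketched in a few lines. This is a minimal illustration under assumed names (`QUESTIONS`, `enter_marks`), not the actual platform logic, which CBSE has not made public.

```python
# Hypothetical sketch of question-wise mark entry with automatic totalling.
# QUESTIONS and enter_marks are assumed names, not the real platform's API.

QUESTIONS = {"Q1": 5, "Q2": 5, "Q3": 10}  # question -> maximum marks

def enter_marks(awarded):
    """Check every question has an in-range entry, then auto-total."""
    missing = set(QUESTIONS) - set(awarded)
    if missing:
        # A skipped question blocks submission instead of silently totalling.
        raise ValueError(f"Unevaluated questions: {sorted(missing)}")
    for q, marks in awarded.items():
        if not 0 <= marks <= QUESTIONS[q]:
            raise ValueError(f"{q}: {marks} is outside 0..{QUESTIONS[q]}")
    return sum(awarded.values())  # the examiner never adds up by hand

print(enter_marks({"Q1": 4, "Q2": 3.5, "Q3": 8}))  # 15.5
```

The two checks capture the procedural gains the article describes: no question can be left unmarked, and the total is computed rather than hand-added.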
What changes for students and what OSM can improve
Under OSM, the handling of the script becomes more structured. Once the copy is digitised, it moves through a system where each stage, from allocation to evaluation, is tracked. Marks are entered directly into the platform, and the system calculates totals automatically. This reduces the possibility of arithmetic errors, which have traditionally been one of the most common reasons students apply for verification.

The system also makes the evaluation process more complete in a procedural sense. Since answers are checked on screen and marks are entered question-wise, the chances of a response being left unchecked or skipped by oversight become lower. The flow of evaluation becomes more standardised across examiners.

There is also a practical shift in how quickly things can move. Without the need to physically transport answer books and organise large evaluation centres, the process can become more efficient. While timelines depend on multiple factors, the structure of OSM is designed to reduce the delays that arise from logistics.

At the same time, it is important to understand what OSM does not change. The quality of an answer is still judged by a teacher. The examiner still reads what the student has written, interprets it against the marking scheme, and decides how many marks it deserves. That part of evaluation remains entirely human. A digital system can organise the process and reduce clerical errors, but it does not standardise judgement in the way a machine would. In that sense, OSM improves the accuracy of the process, not the nature of the assessment.