An onscreen evaluation system provides many advantages:

  • Results can be generated quickly and accurately
  • Answer sheets are managed and stored digitally
  • Answer sheets can be retrieved at any time, from any location
  • Examiners can evaluate digital copies of answer sheets quickly
  • Evaluation accuracy improves significantly

However, the success of any technology implementation usually depends on the participation of its stakeholders and on proper training practices adopted by the university. A lack of training or hands-on exposure to the onscreen evaluation system can hamper progress and the overall evaluation process for the education institute or university.

The following are key mistakes to avoid if you want to make the onscreen evaluation implementation a success for your education institution.

1. Not Implementing a Pilot Phase

Any new system should be adopted in a phased manner, with continuous feedback from stakeholders. The pilot phase is a key step that helps institutes adopt the new system, as it allows examiners to run a mock evaluation process.

The pilot phase can be monitored by the platform's technology experts along with key stakeholders of the education institution, including the controller of examinations and the registrar.

 

Some institutions adopted an onscreen evaluation system in a hurry, without any testing or pilot phase, and this resulted in errors during the answer sheet evaluation phase. Acceptability of a new system goes down if you skip the pilot phase and the essential feedback it provides about overall system functionality.


The onscreen evaluation platform of Splashgain follows best practices, and the pilot phase is an essential part of the implementation process.

 

The pilot phase has helped many universities and autonomous institutes gain hands-on experience with the system, and it has led to successful adoption by evaluators and faculty members across these institutions.

2. Not Providing Enough Hands-on Training to Evaluators/Moderators

 

[Image: Training for the onscreen marking system]

In addition to the pilot phase, having training material along with live training sessions on how to use the onscreen evaluation system is critical for the overall success of the process.

 

Before any examiner starts the evaluation process, a hands-on training session should be conducted. Without such sessions, examiners will not understand the functionality and usability of the system, which can hamper successful adoption and implementation of the onscreen evaluation process.

Some universities tried to implement an onscreen evaluation system on a large scale but failed to provide proper hands-on training to each evaluator or moderator. This increases resistance to the new system: teachers and professors may oppose its adoption due to lack of training.

The onscreen evaluation platform of Splashgain provides live, classroom-style training sessions for each evaluator, along with dedicated training videos and help documents. This helps every evaluator understand the entire system with ease.

 

An onscreen evaluation system can cut the time required to evaluate an answer sheet, with an individual answer sheet typically evaluated within 4 to 5 minutes. Proper training makes this possible. If evaluators are not trained properly and are not aware of the system's features and usage, adoption will be difficult.
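As a rough illustration of what the 4 to 5 minute figure implies, the short calculation below estimates how many answer sheets a single evaluator could complete in a day. The 6-hour effective evaluation day is an assumed value for the example, not a figure from Splashgain.

```python
# Back-of-the-envelope throughput estimate for onscreen evaluation.
# Assumption (illustrative only): a 6-hour effective evaluation day,
# combined with the 4-5 minutes per answer sheet mentioned above.

minutes_per_day = 6 * 60          # effective evaluation time per evaluator per day
minutes_per_sheet = (4, 5)        # time range per answer sheet

for m in minutes_per_sheet:
    sheets_per_day = minutes_per_day // m
    print(f"At {m} min/sheet: about {sheets_per_day} answer sheets per evaluator per day")

# Output:
# At 4 min/sheet: about 90 answer sheets per evaluator per day
# At 5 min/sheet: about 72 answer sheets per evaluator per day
```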

 

3. Not Providing Support During the Evaluation Phase

[Image: Feedback during onscreen marking implementation]

During answer sheet evaluation, an evaluator may face technical or functional difficulties. In such cases, proper help or support should be available to resolve queries so that the evaluation phase runs smoothly.

If your system fails to provide the right kind of support on a timely basis, frustration and dissatisfaction increase among professors and evaluators, and this can delay result processing.

If your university is adopting an onscreen evaluation system on a large scale, with hundreds of evaluators performing digital evaluation of answer sheets, it is essential to have a support system in place, in the form of live chat, a helpline phone number and email support, to resolve evaluators' queries.

The onscreen evaluation platform of Splashgain provides extensive, dedicated support during the evaluation phase, including dedicated helpline numbers, instant email support and live chat to resolve queries as they arise.

 

4. Not Keeping Track of Online Storage of Digital Answer Sheet Copies

[Image: Digital storage of answer scripts]

Digital onscreen marking simplifies result processing and helps you declare results quickly. However, for audit purposes, it is important to store digital copies of the answer sheets for a specific period, typically 3 to 5 years, as per the norms set by the University Grants Commission or the relevant governing body of your education institution.
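As a simple illustration of such a retention rule, the sketch below checks whether a scanned answer sheet is still within its retention window. The five-year default and the function name are assumptions made for the example; substitute whatever period your institution's norms prescribe.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative retention period: an assumed value within the 3-5 year range
# mentioned above, not a prescription from the UGC or any other body.
RETENTION_YEARS = 5

def within_retention(scanned_on: date, today: Optional[date] = None) -> bool:
    """Return True if a digitised answer sheet must still be kept available."""
    today = today or date.today()
    # Approximate a year as 365 days to keep the example dependency-free.
    expiry = scanned_on + timedelta(days=365 * RETENTION_YEARS)
    return today <= expiry

# Example: a sheet scanned in June 2019, checked in January 2023, must still be retained.
print(within_retention(date(2019, 6, 1), today=date(2023, 1, 15)))  # True
```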

 

There should be a proper mechanism to store evaluated copies securely, and you should be able to retrieve any copy in the future when required. Many institutions have failed to define a proper process for storing digital copies of answer sheets, and some have failed even to keep backups of historical copies.

 

This can result in process non-compliance and can harm the university's brand value. Students may file an RTI request for a copy of their evaluated answer sheet; in such cases there should be a proper way to retrieve the historical digital copy and issue it to the student on request.

 

Splashgain maintains all historical answer sheets in secure blob storage with an additional backup/failover mechanism. This lets an education institution retrieve any answer sheet through the admin panel with suitable credentials.
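To make the idea of blob storage with a backup fallback concrete, here is a minimal sketch of retrieving an answer sheet from a primary store and, if it is missing there, from a backup store. It assumes Azure Blob Storage purely for illustration; the container name, blob naming scheme and connection strings are hypothetical and do not describe Splashgain's actual implementation.

```python
from azure.storage.blob import BlobServiceClient
from azure.core.exceptions import ResourceNotFoundError

# Hypothetical connection strings and container name, for illustration only.
PRIMARY_CONN = "<primary-storage-connection-string>"
BACKUP_CONN = "<backup-storage-connection-string>"
CONTAINER = "answer-sheets"

def fetch_answer_sheet(exam_id: str, roll_no: str) -> bytes:
    """Download a scanned answer sheet PDF, falling back to the backup store."""
    blob_name = f"{exam_id}/{roll_no}.pdf"  # assumed naming scheme
    for conn_str in (PRIMARY_CONN, BACKUP_CONN):
        client = BlobServiceClient.from_connection_string(conn_str)
        blob = client.get_blob_client(container=CONTAINER, blob=blob_name)
        try:
            return blob.download_blob().readall()
        except ResourceNotFoundError:
            continue  # not in this store; try the backup
    raise FileNotFoundError(f"{blob_name} not found in primary or backup storage")

# Example: retrieve a sheet for an RTI request and write it to disk.
# pdf_bytes = fetch_answer_sheet("SEM2-2023-PHY101", "12345")
# with open("answer_sheet.pdf", "wb") as f:
#     f.write(pdf_bytes)
```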

 

5. Not Taking Feedback from Examiners, Moderators and the Controller of Examinations

[Image: Examiner feedback during the onscreen marking phase]

Successful adoption and execution of any system depends on a feedback mechanism. Look at any online service provider and you will see that it continuously seeks user feedback; such a mechanism helps improve the usability and functionality of the system.

The success of an onscreen evaluation system depends on feedback from professors, evaluators and moderators, along with senior members of the education institution or university.

 

Many education institutions have adopted an onscreen evaluation system without any feedback mechanism. This proves to be a hindrance to successful adoption across the institution, and you may not be able to derive all the benefits of the platform.

 

Conclusion

 

The success of any new system implementation depends on active usage and participation from its stakeholders. A continuous feedback and improvement loop helps take the system to the next level.

Onscreen evaluation systems are helping universities and education institutes simplify their result processing. Some universities have been able to cut result processing time from 45 days to just 8 days, while the entire logistical and coordination process has been simplified, resulting in cost savings for the institution. If you avoid the five mistakes mentioned in this article, successful adoption of an onscreen evaluation system can become a reality for your education institute.

 

Have you faced any other challenges during the onscreen evaluation implementation process? Please share them with us.