Method Article

Evidence-based operational model for conducting an online, synchronous, proctored, closed book professional examination during the COVID-19 pandemic lockdown

[version 1; peer review: 1 approved with reservations, 1 not approved]
PUBLISHED 07 Mar 2023

Abstract

Background: Medical education has migrated online as a result of the COVID-19 pandemic. Formative and summative evaluation are critical in medical education to support the learning process. However, many problems occur during online assessment, such as internet access, proctoring, and reliable software. There is presently no standardised model that demonstrates reliability and validity under these conditions, particularly for high-stake examinations. The goal of this study was to introduce an evidence-based operational model that was used to conduct a comprehensive high-stake online examination for Year 2 MBBS students during the COVID-19 pandemic. Before entering the clinical years (Years 3-5), Year 2 MBBS students were expected to pass their high-stake examination, previously conducted face-to-face and now administered online.
Methods: The model proposed enables the conduct of an online, synchronous, proctored, closed-book examination in real-time that is comparable to the face-to-face (FTF) manual examination. The examination and invigilation were carried out using the Learning Management System (LMS) and Google Meet applications with the existing information technology (IT) facilities and the Standard Operating Procedure (SOP) was developed to optimise and integrate the technology effectively.
Results: Both students and faculty were satisfied with the online examination, the faculty more so. The online examination, with a Cronbach alpha score of 0.71, showed good internal consistency and reliability, and various tests of validity confirmed that this online model was dependable, reproducible and concurrent with other examinations.
Conclusions: We concluded that this operational model is a credible and very cost-effective alternative to the FTF examination that requires no new learning technology. Thus, we highly recommend this model for all examinations that involve the participation of off-campus students.

Keywords

Online Examination, Synchronous, Proctored, Closed Book Examination, Assessment

Introduction

Assessments are an essential element of medical education, and when combined with subsequent feedback on students' performance, they serve a critical role in helping students improve their learning (Preston et al., 2020). As Bransford et al. point out, assessment is a critical component of effective learning; thus, formative and summative assessment constitute one of the many important components of formal education (Muzaffar et al., 2020; Bransford et al., 1999). The authors indicate that teaching and learning processes need to be assessment-centred to provide learners with opportunities to demonstrate their developing abilities and receive support to enhance their learning. This is especially true for any examination with a high-stake component. Under these circumstances, the examination department's responsibility to conduct a fair and trustworthy examination in a conducive atmosphere becomes critical to ensuring that the program's integrity and quality are maintained through the administration of high-quality, peer-reviewed examinations.

With emerging technological advancements, the challenges in conducting high-stake examinations are growing, and new challenges emerged during the COVID-19 pandemic in 2020. During these trying times, governments imposed lockdowns and urged universities to deliver their curriculum using online platforms so that students could continue to progress. Beyond delivering the teaching and learning content, the challenge of assessing student learning was enormous. This is because assessment drives learning, and the need for valid and reliable online assessment tools cannot be overemphasized (Reyna, 2020).

Before the COVID-19 pandemic, medical institutions frequently conducted formative assessments using the Learning Management System (LMS) platform as the main mode of assessing learning, but all summative examinations, especially high-stake examinations, were conducted face-to-face (FTF). The written paper, which included Multiple Choice Questions (MCQ), Modified Essay Questions (MEQ), Case-Based Questions (CBQ) and Problem-Based Questions (PBQ), was conducted in the examination hall; Objective Structured Practical Examinations (OSPE) were conducted either as mobile stations or on a digital platform; Objective Structured Clinical Examinations (OSCE) were conducted as clinical stations using mannequins and simulated patients; and clinical examinations comprising long and short cases were conducted in hospitals or clinical settings. All these examinations were proctored in FTF sessions. Although many universities across the globe used assignments as the summative assessment tool in their online distance learning programs, the use of assignments in assessing a medical graduate program has been close to nil. In the COVID-19 pandemic, with online being the only mode of assessment possible, conducting all these examinations synchronously and proctored has been a daunting task. While many teaching and learning methods were widely prescribed, practically no guidelines were available for conducting online summative examinations that are proctored, synchronous and administered with an acceptable degree of validity and reliability. Thus, the aim of this research was to conduct an online, synchronous, proctored, closed-book examination that was reliable and valid. The objectives of this study were to (1) perform a needs analysis to identify the strengths, weaknesses, opportunities and threats in conducting this examination; (2) develop and design the SOP and systems that would enable the conduct of this examination; (3) conduct a high-stake professional examination using the developed and tested examination system; and (4) evaluate the reliability and validity of this system.

In this paper, we present an operational model that was adapted in a medical and allied health university to conduct online, proctored and synchronous high-stake barrier examinations in the First Professional Examination in Year 2, and the Final Professional Examination in Year 5 of the 5-year medical (MBBS) program. We used innovative approaches using existing online tools and platforms to conduct these examinations. Our aim was to conduct an online, synchronous, proctored, high-stake summative examination that has a high degree of validity and reliability.

Methods

Proposed system

The proposed system is an examination process based on the Modified Fish Bone Online Examination Model (MFBOEM) (Figure 1).

0f2ab5d2-08c2-4c10-bbcb-52f47ae0d4dd_figure1.gif

Figure 1. Modified Fish Bone Online Examination Model (MFBOEM) (Developed by the authors).

This examination model suite is designed for a scenario in which students can take an online test anywhere in the globe, while the examination is conducted and controlled in the university's information technology laboratory by a team of examination committee experts, qualified invigilators, and a technical team. This examination was conducted in the school of medicine for Year 2 MBBS students in August 2022. The research, development, implementation, and analysis of this system were conducted between June and October 2022.

Introduction

Based on the university’s prescribed examination regulation and statutory approval by the relevant university stakeholder committees (Senate/Faculty Board/Assessment & Examination Board), the Standard Operating Procedure (SOP) for the conduct of the proctored online examination protocol was adopted (Figure 1). In line with the examination regulations and SOP, an endorsed comprehensive timetable depicting chronological events starting from Step 1 to Step 9 leading towards completion of the online proctored examination was developed. The stipulated events are summarized in a modified fish bone diagram starting with the standard set guidelines in the left 3 boxes running through the parallel events for invigilators and students finally leading to effective end results as shown in Figure 1.

Procedure

The process of the online exam is described in 9 steps for easy understanding and implementation.

STEP 1: Initial communication

Students and staff were briefed through online google meet platform regarding the examination regulation and SOP for preparation and implementation of the online examination. During this session, students were also made aware of all the eligibility criteria for the ONLINE examination as indicated in Step 2 in Figure 1.

STEP 2: Requirements

Implementation of the online examination requires both hardware and software items that cater for all stakeholders (students, invigilators and the IT laboratory). Requirement fulfilment and eligibility adherence of students were assessed using pre-determined Google survey forms, and, as follow-up from the Step 1 briefing, students' administrative shortfalls (registration/ID/fees, etc.) were addressed by the Year Coordinator. The forms were used to collect information regarding students' concerns about the online examination. The data were not used for any documentation or analysis in the examination or in this article. This form was circulated two weeks before the mock examination. Hardware and software issues were addressed by the relevant technical team.

Likewise, invigilator requirements (such as access to Google Meet & LMS accounts) were appraised and issues, if any, were ameliorated by one-to-one consultation with each invigilator with the help of an in-house technical team.

Technical requirements:

  • 1. Examination Venue: IT Lab 1A and 1B with the capacity of 50 desktop Personal Computers (PCs) in each laboratory.

  • 2. Operating System: Windows edition 10 Home Single Language.

  • 3. System: Desktop PC setup with earphones.

  • 4. Processor: Intel(R) Core(TM) i5-6400 CPU @ 2.70 GHz (2.71 GHz), 4 GB RAM and 64-bit operating system.

  • 5. Internet Specification: A minimum speed of 500 Mbps through LAN cable.

  • 6. Software Specifications:

    • a. Google Meet: A video-communication portal developed by Google (G Suite education license, compatible with Android and iOS platforms) for online streaming, recording and storage.

    • b. Learning Management System (LMS): Moodle 3.7 version, managed by an internal professional team.

    Non-technical requirements:

  • 1. Invigilators’ requirements:

    • a. Invigilators must have access to a university allotted Google Meet account.

    • b. Invigilators must have access to university’s LMS account.

  • 2. Students’ requirements: (eligibility criteria)

    • a. Examination registration (attendance, fees clearance, identification card, etc.).

    • b. Laptop/desktop (with camera enabled).

    • c. Internet data pack of a minimum 10GB (unlimited data pack was recommended).

    • d. Access to a university allotted online Google student account for Google meet.

    • e. Access to university allotted online LMS.

    After the provisional approval of processes of online examination design plan, it was thoroughly tested by the exam and IT technical team for its technical viability, credibility and reliability, following which the final version of this exam model was approved by the relevant university boards.

STEP 3: Testing the system

Students were briefed on all enquiries based on their feedback from Step 2. Students were also instructed on the conduct of the Mock Online Examination.

Pattern of examination:

  • Use of LMS as the examination tool.

  • Proctoring was done by live visual streaming of students taking their examination by desktop/laptop and mobile devices through Google Meet with simultaneous window display of the question paper in the LMS platform.

  • Question components included single best answer, short essay question (SEQ), and case-based question (CBQ).

  • Duration: Each Single Response Answer (SRA) was allotted 90 seconds; each SEQ/CBQ was allotted 15 minutes.

  • The response method included checking the correct box for the options given under SRA and typing the answer in the text box provided for SEQ/CBQ.

  • Examination rules, regulations and SOPs were posted as self-read material on a specific LMS examination folder, which was made available to both staff and students.
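The timing rules above (90 seconds per SRA, 15 minutes per SEQ/CBQ) determine the total paper duration. A minimal sketch, assuming illustrative question counts rather than the actual examination blueprint:

```python
# Hypothetical sketch: total examination duration from the stated timing
# rules. The question counts in the example call are assumptions, not the
# study's actual paper composition.

SRA_SECONDS = 90       # time allotted per Single Response Answer
ESSAY_MINUTES = 15     # time allotted per SEQ or CBQ

def exam_duration_minutes(n_sra: int, n_essay: int) -> float:
    """Return the total allotted time in minutes."""
    return n_sra * SRA_SECONDS / 60 + n_essay * ESSAY_MINUTES

# Example: a paper with 60 SRAs and 6 SEQ/CBQ items
print(exam_duration_minutes(60, 6))  # → 180.0
```

Such a calculation is useful when configuring the LMS quiz's pre-determined opening and closing times described below.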

    Training of invigilators:

    This process started with a briefing on the practical training, followed by hands-on experience in developing the necessary skills in the software and all the relevant platforms. The training was to familiarize invigilators with the use of Google Meet: opening an account, creating single and multiple Google Meet session links with associated times in the Google Calendar, recording Google sessions, setting up single- and multiple-link audio and video communication with students, sharing these URLs with students via email, and extracting the recorded video of the Google session to the database. All invigilators were familiar with the use of the LMS for examination purposes, as it was used frequently during teaching-learning activities.

STEP 4: Implementation of the mock exam

After establishing the Google Meet connection with the invigilators, the students were requested to share their entire screen with the invigilators. Upon approval from the invigilators, students were instructed to open the examination folder in the LMS. Following instructions from the invigilators, students commenced reading the examination regulations and SOP for the online examination. At the stipulated time, students were asked to open the examination question paper using a password provided individually via WhatsApp. At the conclusion of the examination, students cross-checked to confirm that all questions had been attempted and, with the permission of the invigilators, ensured that the submit button had been clicked to submit the answers, then exited the platforms, thus marking the end of the examination.

IT Lab Workstation Design Plan: Each invigilator was accommodated in a climate-controlled, ergonomically structured IT lab, observing a physical distance of one meter. As shown in Figure 2, each invigilator was assigned two internet-enabled (LAN connection) desktop PCs and two headphone sets (ear lobes covered). Each PC displayed three students' Google Meet link browsers, with each full shared screen depicting the LMS page downsized to fit the PC screen. Thus, each invigilator had in total six students' Google Meet links, three on each of the two PCs. All information was stored on a secure server and later retrieved at the data centre as and when necessary.

0f2ab5d2-08c2-4c10-bbcb-52f47ae0d4dd_figure2.gif

Figure 2. The process involved in data generation, storage and retrieval that was adopted to conduct this examination.

Question paper setting in LMS:

  • 1. Examination question papers were made available in a specially created examination folder in the LMS.

  • 2. Question papers had pre-determined starting and ending times (after the ending time, students could no longer attempt the questions).

  • 3. Students were allowed to scroll forward and backward in the question grid (i.e., they could answer the questions in any order rather than sequentially).

  • 4. Questions and the distractor options were shuffled for each student.

  • 5. LMS settings ensured that the question paper could not be downloaded.

  • 6. Students had to key in their answers in the space provided in the LMS.
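Moodle handles per-student shuffling internally; purely as an illustration of the behaviour described above, the following hypothetical sketch seeds the shuffle with a student identifier so that each student receives a stable but distinct ordering of questions and options:

```python
# Illustrative sketch (not Moodle's implementation): per-student shuffling
# of questions and distractor options, seeded by student ID so the same
# student always sees the same ordering.
import random

def shuffled_paper(questions, student_id):
    """questions: list of (stem, options) pairs; returns a shuffled copy."""
    rng = random.Random(student_id)   # reproducible per student
    paper = []
    for stem, options in questions:
        opts = options[:]
        rng.shuffle(opts)             # shuffle the answer options
        paper.append((stem, opts))
    rng.shuffle(paper)                # shuffle the question order
    return paper

questions = [("Q1", ["A", "B", "C", "D"]), ("Q2", ["W", "X", "Y", "Z"])]
# The same student ID always yields the same paper:
assert shuffled_paper(questions, 101) == shuffled_paper(questions, 101)
```

Seeding by student ID also means a re-sit of the same paper presents an identical layout to that student, which simplifies invigilation queries.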

Invigilation Mode: According to the time scheduled in the examination timetable, the invigilators reported to the venue and to the Chief Examination Invigilator for the day, took charge of their examination workstation as per the duty allocation by the examination team, and confirmed the workstation's readiness as per the training instructions. Based on the student list allotted to each invigilator, which was pre-determined by the examination team (one invigilator assigned to six students), the invigilator sent a separate student-specific Google Meet link to every student through email. The students logged in using this link and established the connection with the invigilator 30 minutes before the actual examination starting time. This was to ensure that any technical issues were sorted out before the start of the examination. Invigilators then informed the students about the video recording of the session and started recording the Google Meet session, followed by preparation of the students as per the prescribed SOPs, summarized below.

  • 1. Students will display their ID cards/Examination hall tickets.

  • 2. Students will show their surrounding (360-degree) area, including the desk, through the desktop/laptop camera by rotating the camera to confirm that there are no reading materials, other electronic devices (apart from the permissible mobile phone) or any other persons in the space identified by the student taking the examination (Figure 3, Student's Workstation). Students are informed that the invigilators are at liberty to make such an inspection at any point in time when suspicion is aroused by the student's activity.

  • 3. Students are directed to arrange their seating positions such that they face the wall on two sides (Figure 3).

  • 4. Students are directed to place their mobile phone with camera turned on and connected all through the examination on to the wall on their left or right side depending on their seating arrangements (Figures 3 and 4).

  • 5. Students are advised to keep their session audio unmuted throughout the examination period.

  • 6. Students will now be directed to open the LMS in another browser window, share this window with the invigilator, open the examination folder, and start reading the examination hall regulations and the SOP for the online examination under the supervision of the invigilator for 30 minutes.

  • 7. Students will now be directed to open the online question paper at the allotted examination starting time, with the password for the question paper shared with individual students five minutes before the examination starting time.

  • 8. Students will be advised to start attempting the question paper in real time. They are also reminded that the question paper is time bound (i.e., it will open and close at a pre-determined time setting), and no downloading is allowed.

  • 9. Six students will be proctored by one invigilator for the entire period of the examination.

  • 10. Students will be advised to raise their hand (physically) in front of the camera to notify the concerned invigilator in case any troubleshooting is required. The invigilator would then assist them individually without disturbing the other students (muting all other students except the one with issues), as each student is connected to the invigilator via a specific link to avoid overlap.

  • 11. Students are advised to contact their invigilator immediately through their WhatsApp video call/normal call in case they have issues with their PCs/internet connection. They are also advised to be in touch with their invigilator until the issues are addressed by the examination/technical team.

  • 12. Students are advised that, as per the online examination rules, if the student loses internet connectivity or experiences any other technical issues that cannot be rectified within 15 minutes, the student will have to take the same exam on another day.

  • 13. Students who wish to use the washroom will have to notify the invigilator, as explained in Step 8 above.

  • 14. Students are advised to focus on the screen as much as possible, allowing for reasonable human lapses; if the invigilator detects suspicious eye movements, the invigilator can ask the student to display the surroundings, as explained in Step 2, with a warning, and three such recorded warnings can be considered an examination irregularity.

  • 15. Students are notified of the last 15 minutes before examination ending time.

  • 16. Students are directed to click on the submit button in the LMS page after they have completed their examination.

  • 17. Students are advised to exit the LMS first and then the Google Meet session once the session recording stops.
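The pre-examination allotment described under Invigilation Mode above (six students per invigilator, each with a student-specific link) can be sketched as follows. The link format is a placeholder, not a real Google Meet URL scheme; actual links would be created through Google Calendar as described in the invigilator training:

```python
# Hypothetical sketch of the invigilator-student allotment: six students
# per invigilator, each assigned a student-specific meeting link. The
# "meet.example" URLs are placeholders, not real Google Meet links.
def allot(students, invigilators, per_invigilator=6):
    """Return {invigilator: {student: link}} for the examination session."""
    plan = {}
    for i, inv in enumerate(invigilators):
        group = students[i * per_invigilator:(i + 1) * per_invigilator]
        plan[inv] = {s: f"https://meet.example/{inv}-{s}" for s in group}
    return plan

students = [f"S{n:02d}" for n in range(1, 13)]   # 12 students
plan = allot(students, ["INV1", "INV2"])
assert len(plan["INV1"]) == 6 and "S07" in plan["INV2"]
```

Keeping one link per student (rather than one shared room) is what allows the invigilator to mute everyone except the student who raised an issue, as item 10 above requires.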

0f2ab5d2-08c2-4c10-bbcb-52f47ae0d4dd_figure3.gif

Figure 3. Student workstation (aerial view).

0f2ab5d2-08c2-4c10-bbcb-52f47ae0d4dd_figure4.gif

Figure 4. Fields of vision available to invigilators.

In the event that an invigilator needs to leave his/her workstation, the invigilator must notify the Chief Examination Invigilator for the day so that a substitute invigilator can be arranged for that period. The invigilator will end the session after affirming that all events have been completed and send the recorded video link for each student to the Chief Examination Invigilator for documentation and storage.

Invigilators will be required to sign the Examination Oath Declaration form before commencement of the examination activity. The Chief Examination Invigilator, together with the Examination Coordinator and Examination Team, will supervise the conduct of the examination.

Security check: The invigilators were instructed to check for screen and cursor movements; movement observed on the screen indicated active use of the cursor. The signals indicating full screen sharing, student eye movement, and LMS and Google Meet logins were stringently observed, thus providing evidence of log activity during the examination time for each student.

STEP 5: Student mock/trial examination and proctored online examination invigilation feedback

After completing their mock examinations, students were encouraged to complete the pre-determined Google Survey Form to share their experiences, difficulties and any suggestions for improvement. The Google Survey form can be found under Extended data (Mohanraj, 2022).

Similarly, after completing the online mock examinations, invigilators are encouraged to complete the pre-determined Google Survey Form to share their experience and difficulties, how they handled issues (if any) and finally, any suggestions for improvement.

Separately designed Google Survey Form links are distributed via email to both students and invigilators.

STEP 6: Students’ and invigilators’ briefing on post mock/trial examination (report from feedback 1)

Students were briefed (via an online Google Meet session) on the overall conduct of the mock examinations (do's and don'ts emerging from them). In line with the invigilators' proctoring inputs, students were sensitized to any technical shortfalls observed during the examination, and technical guidelines and steps to improve the situation were recommended. Concerns raised in the students' feedback were looked into and addressed accordingly.

Invigilators were also briefed (arranged by online Google Meet session for briefing) on the overall conduct of the mock examinations. Students’ feedback on the examination conducted and proctoring issues (if any) were raised, addressed and rectified. Concerns raised from invigilators’ feedback were shared among all invigilators and solutions discussed.

Taking into account students' and staff's feedback on the two mock examinations, the examination department addressed and corrected all issues. During the Professional Examination, no technical issues were encountered.

STEP 7: Students’ and invigilators’ briefing on the final proctored online examination

Students were reminded of the Examination Regulation and SOPs for the final online examination. Students were made aware of the changes incorporated into the examination conduct based on feedback and asked to adhere to the dress code during the online examination as per the examination rules and regulations. Invigilators were briefed on the overall proctoring guidelines and changes incorporated thereof, based on the experience and inputs from students’ and invigilators’ feedback from the trial runs of the mock examinations.

STEP 8: Student final examination and proctored online examination invigilation

According to the time scheduled in the examination timetable, upon fulfilment of all prescribed online examination requirements, the final online examination was conducted as explained in Step 4, drawing on all the ground experience gained and the feedback received from both students and invigilators during the trial mock examinations.

The invigilators will check with the Chief Examination Invigilator, who will ensure that all students have attempted and submitted their examination answers correctly in the LMS by cross-checking with the technical team (LMS) before releasing the students. The Chief Examination Invigilator, together with the invigilators, will compile the invigilation report at the end of the examination, to be submitted to the Examination Unit.

STEP 9: Student final exam and proctored online examination invigilation feedback

Students and invigilators, after completing the proctored online final examinations, are encouraged to complete the pre-determined Google Survey Form to share their experiences, difficulties and any suggestions for improvement. Separately designed Google Survey Form links are distributed via email to both students and invigilators. All feedback from both students and invigilators are processed and analysed. Output from the analysis paves the way towards amending the online examination regulation SOPs. This marks the end of the proctored online examination process.

Measures to mitigate malpractice

Our process has been tested to address the possibility of malpractice by using the following approach:

  • 1. Use of two video cameras: The use of video feed from the student’s phone and laptop gave us a wide angle to view his/her working space. This prevented the student from using any paper or technology-based information as a malpractice tool.

  • 2. Monitoring student’s eye movement and keyboard activities: The use of split screens ensured that every time the student looked away from the screen, he/she was expected to look down to type his/her answers. Whenever the student’s keyboard was in use, the content typed would be visible to the invigilator to view on the LMS Examination page. The invigilator monitored the student’s gaze and also scrutinized his/her keyboard activity. Appropriate warning would be given if change in his/her point of gaze was observed and/or if the keyboard was found idle for a considerable amount of time to raise any suspicion of malpractice.

  • 3. Preventing third party verbal assistance by using the sound system: The students were instructed to keep their microphones on their devices on at all times. This was to ensure that there was no verbal assistance from anyone in the vicinity of the candidate.

  • 4. Preventing use of digital or paper-based aids by using desk and surrounding check protocols: The students were instructed that any time the invigilator's suspicion was credibly aroused, they would be requested to use their camera to show their desk space and the surrounding environment. This improved the invigilators' confidence and freedom to enforce quality invigilation.

  • 5. Preventing impersonation by using LMS Logs: Using IT support, we were able to monitor the number of devices our students used to log on to their LMS page. This was to ensure only one device was used in the LMS Examination page activation. This ensured that there was no ghost writer who could take the examination on the student’s behalf. Along with student gaze and keyboard activity monitoring system, the process ensured that impersonation was impossible on this online examination platform.
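The single-device check described in item 5 can be sketched as a scan over LMS access-log records that flags any student whose examination attempt was opened from more than one device or IP address. The field names below are assumptions for illustration, not the actual Moodle log schema:

```python
# Hypothetical sketch of the LMS-log impersonation check: flag students
# whose exam activity came from more than one (ip, device) pair. Record
# fields are illustrative assumptions, not Moodle's real log format.
from collections import defaultdict

def flag_multi_device(log_records):
    """Return sorted student IDs seen on more than one (ip, device) pair."""
    seen = defaultdict(set)
    for rec in log_records:
        seen[rec["student_id"]].add((rec["ip"], rec["device"]))
    return sorted(s for s, devices in seen.items() if len(devices) > 1)

logs = [
    {"student_id": "S01", "ip": "10.0.0.5", "device": "laptop"},
    {"student_id": "S01", "ip": "10.0.0.5", "device": "laptop"},
    {"student_id": "S02", "ip": "10.0.0.7", "device": "laptop"},
    {"student_id": "S02", "ip": "192.168.1.9", "device": "phone"},
]
print(flag_multi_device(logs))  # → ['S02']
```

A flagged student is not automatically guilty (a legitimate device change after a crash would also trigger it), which is why the log check is combined with the gaze and keyboard monitoring above.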

Taking into account all these measures, we further placed confidence in the professionalism imparted by our curriculum, which many researchers regard as essential (Ludmerer, 1999; Mohanraj, 2020) and which is tested in such circumstances.

Results and discussion

The proposed proctored, synchronous, closed-book online barrier examinations were conducted for Year 2 and Year 5 MBBS students at a Malaysian medical and healthcare university. We present the findings from the Year 2 Professional MBBS Examination conducted using the proposed system. As indicated in Tables 1 and 2, 51.9% of the student population were women, and within this cohort 51.3% were international students. The students were aged between 18 and 21 years. The examinations were proctored by twenty-six lecturers aged between 35 and 70 years. As most Teaching and Learning Activities (TLA) had been conducted on the LMS and Google Meet during the online sessions, both students and lecturers were comfortable with and welcomed the use of this new system. All the students took their examinations from the comfort of their homes or other preferred locations. It is worth noting that all the international students (from over eight countries) participated in this examination from their home countries. The full raw data can be found under Underlying data (Mohanraj, 2022).

Table 1. Gender distribution.

          Frequency    Percent
Male          76        48.1
Female        82        51.9
Total        158       100.0

Table 2. Nationality.

               Frequency    Percent
Local              77        48.7
International      81        51.3
Total             158       100.0

Test for reliability: Table 3 shows the cohort's mean scores in all the continuous assessment examinations, which were conducted face to face, and the mean score for the Professional examination conducted online. The test for reliability shown in Table 4 yields a Cronbach alpha score of 0.714, which indicates that the students' performance across all these examinations was reliable and consistent. The results indicate that students' performance in this online examination is consistent with all their previous continuous assessments held earlier in the academic year.

Table 3. Scores in CA & professional exam.

            Mean (%)     SD       N
EPI(F2F)     68.80      10.55    158
GIN(F2F)     69.33      10.95    158
MSK(F2F)     67.36      10.93    158
UIR(F2F)     60.98      10.24    158
NVS(F2F)     77.31       8.85    158
RPD(F2F)     77.20       7.50    158
EDM(F2F)     70.50      11.59    158
PRO(OPS)     58.40      11.02    158

Table 4. Reliability statistics.

Cronbach's alpha    Cronbach's alpha based on standardized items    No. of items
0.712               0.714                                           8
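Cronbach's alpha, as reported in Table 4, is computed from each student's scores on the k = 8 assessments. A minimal sketch of the standard formula follows; the score matrix below is illustrative dummy data, not the study dataset:

```python
# Sketch of Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) /
# variance of total score). Rows are students, columns are assessments.
# The generated scores are dummy data, not the study's results.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = students, columns = items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total score
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
base = rng.normal(70, 10, size=(158, 1))            # shared "ability" factor
scores = base + rng.normal(0, 8, size=(158, 8))     # 8 correlated assessments
print(round(cronbach_alpha(scores), 3))
```

Higher inter-item correlation pushes alpha toward 1; eight perfectly identical score columns would yield alpha = 1 exactly.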

Test for validity of method adopted to conduct online examination: To test the extent to which this assessment tool measures what it claims to measure, we used the following tests.

Face validity: Students' and staff's perceptions were collected as feedback after the examination. The results indicate that both staff and students were quite satisfied with the process and its convenience, and showed a significant amount of confidence in this system. Table 5 summarizes the students' opinions regarding these examinations.

Table 5. Students and staff feedback.

   Feedback questions (Students & Staff [n=150])                                              Mean    SD
1  Are you satisfied with the overall environment of the Online Examination?                  4.17    0.92
2  The process of this Online examination was ideal for conducting the Professional Exam.     3.86    0.98
3  This Online exam does not compromise the student's experience of this process in any manner.  3.54    1.09
4  I was very comfortable in handling the technology and device during this examination.      4.19    1.04
5  I am very satisfied with this Online examination process.                                  3.89    0.97
6  This Online exam was as good as the face-to-face exam.                                     3.70    1.08

Content validity: Since all the measuring tools used in the face-to-face examinations were used in this online examination, none of the assessment strategies from any of the previous examinations was altered or modified. This examination tested the delivered curriculum in depth and range, which was supported by positive reviews from independent external examiners.

Criterion validity: We tested the correlation between the continuous assessment results and the online Professional examination results obtained by this cohort of students. The results in Table 6 show that, except for the GIN block, performance in all the other blocks was positively correlated with the online examination, significant at the 0.01 level. These results indicate that this online examination method yielded results reflecting the same construct as students' performance in continuous assessment. Figure 5 also indicates that the performance of this cohort, in terms of pass percentage, was within the range of previous performance observed over the last decade.

Table 6. Correlation matrix.

                            EPI(F2F)  GIN(F2F)  MSK(F2F)  UIR(F2F)  NVS(F2F)  RPD(F2F)  EDM(F2F)
PRO (OPS)  Pearson r        0.431**   0.064     0.313**   0.562**   0.472**   0.289**   0.454**
           Sig. (2-tailed)  0.000     0.427     0.000     0.000     0.000     0.000     0.000

** Correlation is significant at the 0.01 level (2-tailed).

* Correlation is significant at the 0.05 level (2-tailed).


Figure 5. The overall pass percentage in previous and current cohort.

The proctored, synchronous, closed-book, online examination described above has been demonstrated to be a trustworthy and legitimate procedure that may be utilised to administer any high-stake examination. With a major portion of 2020 and 2021 affected by COVID-19, medical education was compelled to shift to the online delivery of content, and thus arose the need for a reliable online method of assessing the curriculum. Currently, many schools and universities have adapted online assessment tools to fit their purpose, but practices vary, and no standard guidelines, tools or methods have been proven to be reliable alternatives in the education sector. In this paper, we demonstrated that our suggested examination procedure is an effective instrument for assessing the curriculum and is comparable in reliability, efficiency and convenience to the time-honoured face-to-face mode of assessment. Additionally, we propose that this examination method be utilised in conjunction with assignments to assess students' performance in various distance learning modules.

One of the most advantageous features of our suggested online testing approach is its low cost. We utilised the current Learning Management System (LMS), a free Moodle-based system, in conjunction with a complimentary Google Meet account. The majority of institutions of higher learning have implemented a curriculum delivery method that is heavily reliant on an IT-based support system. This translates into the use of LMSs for sharing learning materials and administering exams, as well as the use of computer laboratories for Computer-Assisted Learning. We have shown that our suggested approach can be used to administer a successful online test utilising the existing technology, essentially at no additional expense.

Our online, proctored examination was very well received by both teachers and students, with particular emphasis on the simplicity with which this innovative approach could be adopted. The students, being tech-savvy, had no trouble understanding the information and following the protocol. A digital divide between students and lecturers has been observed by many researchers (Mohanraj et al., 2019; Sabqat and Khan, 2019), and it seems to be growing with time. In this scenario, our success lay in the fact that, even though a fair number of the examiners involved were digital novices and digital immigrants, there were no issues in grasping this technology-dependent process. One factor that facilitated this universal acceptability is that there was no requirement to learn new software: we used familiar tools and tweaked them to our purpose. Researchers have shown that students are subjected to stress in their academic journey, and examinations are one of the major contributing factors (Sohail, 2013; Fares et al., 2016) that could determine their performance. Thus, our effort was to provide a system that adds no additional stress to the students' process of taking their exams, which was achieved by integrating well-known and familiar platforms into the system.

The limitations of our system include the necessity of achieving a good student-to-staff ratio. In a traditional face-to-face environment, the common practice is to assign one staff member to every 20 students; in our system, however, we used one staff member for every six students. We understand that the lower the student-to-staff ratio, the better the quality of invigilation. This could be mitigated by engaging part-time invigilators specifically trained for this exercise. Secondly, the strength of the internet connectivity for both students and invigilators is the backbone of conducting a smooth online examination; it could be hampered by unforeseen issues or by lack of student preparation (with regard to data package availability). Lastly, any technology-dependent system is only as good as its infrastructure and the support staff who maintain it; we recognise that this can be a shortcoming in some educational institutions. However, since we propose using only the existing system, good internet connectivity in the institution and an up-to-date operating system are all that is required to conduct this examination. We recommend that this examination system be implemented among multiple cohorts, both within a university and across different universities, to improve its reliability and validity.

Conclusions

The suggested proctored, synchronous, closed-book, online examination can assist in transforming the manual ways of conducting ongoing assessments and high-stake examinations onto an online platform for remote learners, which is especially important in these trying times. The system has a high reliability index and repeatability, guaranteeing that the standard and quality of a face-to-face examination are fulfilled in every way.

This method is extremely cost effective because it does not necessitate the purchase of any additional hardware or software. This form of online examination may be conducted at any educational institution with good internet access and a resourceful IT support team. Furthermore, because no new software or hardware is utilised, no substantial training or workshops are necessary for invigilators or students to understand and operate this method.

Thus, we suggest that this proctored, synchronous, closed book, online examination system is reliable, efficient, effective, economical and convenient and recommend it to be used in all the educational sectors that engage in student learning.

Data availability

Underlying data

Figshare: Data for Repository.xlsx. https://doi.org/10.6084/m9.figshare.20024183.v2 (Mohanraj, 2022).

This project contains the following underlying data:

  • - Data for Repository.xlsx [The information provided contains raw student data obtained from multiple summative examination and from different cohorts. All the data provided have been anonymised as per the criteria.]

  • - FigShare_Online Exam Feedback Form - Student (Responses)-Raw.xlsx

Extended data

This project contains the following extended data:

  • - Student Online Exam Feedback Form.pdf

Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).

Author contributions

  • The corresponding author is responsible for ensuring that the descriptions are accurate and agreed by all authors.

  • All the authors were involved in conceptualization and validation of this study.

  • Jaiprakash Mohanraj (JPM), Ivan Rolland Karkada (IRK) were involved in data collection & resources management.

  • JPM, Rusli Bin Nordin (RN) & IRK were involved in writing the original draft, review and editing process.

  • JPM & IRK were involved in Project administration, visualization and writing of the original draft.

  • JPM, RN & AFS were also involved in supervision.

How to cite this article: Mohanraj J, Rolland Karkada I and Nordin R. Evidence-based operational model for conducting an online, synchronous, proctored, closed book professional examination during the COVID-19 pandemic lockdown [version 1; peer review: 1 approved with reservations, 1 not approved]. F1000Research 2023, 12:249 (https://doi.org/10.12688/f1000research.122301.1)

Open Peer Review

Reviewer Report, 19 Jul 2024: Maggie Hartnett (Massey University, Palmerston North, Manawatu-Wanganui, New Zealand). Status: Not Approved. (https://doi.org/10.5256/f1000research.134273.r290474)

Reviewer Report, 03 Jul 2024: Aisha Rafi (Shifa College of Medicine, Shifa Tameer e Millat University, Islamabad, Pakistan). Status: Approved with Reservations. (https://doi.org/10.5256/f1000research.134273.r290476)