Keywords
Online Examination, Synchronous, Proctored, Closed Book Examination, Assessment
Assessments are an essential element of medical education, and when combined with subsequent feedback on students' performance, they serve a critical role in helping students improve their learning (Preston et al., 2020). As Bransford et al. point out, assessment is a critical component of effective learning; thus, formative and summative assessment constitute one of the many important components of formal education (Muzaffar et al., 2020, Bransford et al., 1999). The authors indicate that teaching and learning processes need to be assessment-centred to provide learners with opportunities to demonstrate their developing abilities and receive support to enhance their learning. This is especially true for any examination with a high-stake component. Under these circumstances, the examination department's responsibility for conducting a fair and trustworthy examination in a conducive atmosphere becomes critical to maintaining the program's integrity and quality through the administration of high-quality, peer-reviewed examinations.
With emerging technological advancements, the challenges in conducting high-stake examinations are growing, and new challenges emerged during the COVID-19 pandemic in 2020. During these trying times, governments imposed lockdowns and urged universities to deliver their curricula on online platforms so that students could continue to progress. Beyond delivering the teaching and learning content, the challenge of assessing student learning was enormous. This is because assessment drives learning, and the need for valid and reliable online assessment tools cannot be overemphasized (Reyna, 2020).
Before the COVID-19 pandemic, medical institutions frequently conducted formative assessments on the Learning Management System (LMS) platform as the main mode of assessing learning, but all summative examinations, especially high-stake examinations, were conducted face-to-face (FTF). The written papers, which included Multiple Choice Questions (MCQ), Modified Essay Questions (MEQ), Case-Based Questions (CBQ) and Problem-Based Questions (PBQ), were conducted in the examination hall; Objective Structured Practical Examinations (OSPE) were conducted either as mobile stations or on a digital platform; Objective Structured Clinical Examinations (OSCE) were conducted as clinical stations using mannequins and simulated patients; and Clinical Examinations, which include long and short cases, were conducted in hospitals or clinical settings. All these examinations were proctored in FTF sessions. Although many universities across the globe used assignments as their summative assessment tool in their online distance learning programs, the use of assignments in assessing a medical graduate program has been close to nil. During the COVID-19 pandemic, with online being the only mode of assessment possible, conducting all these examinations synchronously and under proctoring was a daunting task. While many teaching and learning methods were widely prescribed, practically no guidelines were available for conducting online summative examinations that are proctored, synchronous and conducted with an acceptable degree of validity and reliability.

Thus, the aim of this research was to conduct an online, synchronous, proctored, closed book examination that was reliable and valid. The objectives of this study were to (1) perform a needs analysis to identify the strengths, weaknesses, opportunities and threats in conducting this examination; (2) develop and design the SOP and systems that would enable the conduct of this examination; (3) conduct a high-stake professional examination using the developed and tested examination system; and (4) evaluate the reliability and validity of this system.
In this paper, we present an operational model that was adopted in a medical and allied health university to conduct online, proctored and synchronous high-stake barrier examinations in the First Professional Examination in Year 2 and the Final Professional Examination in Year 5 of the five-year medical (MBBS) program. We took innovative approaches, using existing online tools and platforms to conduct these examinations. Our aim was to conduct an online, synchronous, proctored, high-stake summative examination with a high degree of validity and reliability.
The proposed system is an examination process based on the Modified Fish Bone Online Examination Model (MFBOEM) (Figure 1).
This examination model suite is designed for a scenario in which students can take an online test anywhere in the globe, while the examination is conducted and controlled in the university's information technology laboratory by a team of examination committee experts, qualified invigilators, and a technical team. This examination was conducted in the school of medicine for MBBS students in August 2022. The research, development, implementation, and analysis of this system were conducted between June and October 2022.
Based on the university's prescribed examination regulation and statutory approval by the relevant university stakeholder committees (Senate/Faculty Board/Assessment & Examination Board), the Standard Operating Procedure (SOP) for the conduct of the proctored online examination was adopted (Figure 1). In line with the examination regulations and SOP, an endorsed comprehensive timetable was developed, depicting the chronological events from Step 1 to Step 9 leading to completion of the online proctored examination. The stipulated events are summarized in a modified fish bone diagram, starting with the standard set guidelines in the three boxes on the left, running through the parallel events for invigilators and students, and finally leading to the effective end results, as shown in Figure 1.
The process of the online exam is described in 9 steps for easy understanding and implementation.
STEP 1: Initial communication
Students and staff were briefed through the online Google Meet platform regarding the examination regulation and SOP for the preparation and implementation of the online examination. During this session, students were also made aware of all the eligibility criteria for the online examination, as indicated in Step 2 in Figure 1.
STEP 2: Requirements
Implementation of the online examination required both hardware and software items catering for all stakeholders (students, invigilators and the IT laboratory). Requirement fulfilment and students' adherence to the eligibility criteria were assessed using pre-determined Google survey forms; as a follow-up to the Step 1 briefing, students' administrative shortfalls (registration/ID/fees, etc.) were addressed by the Year Coordinator. The forms were used to collect information regarding students' concerns about the online examination. The data were not used for any documentation or analysis in the examination or in this article. This form was circulated two weeks before the mock examination. Hardware and software issues were addressed by the relevant technical team.
Likewise, invigilator requirements (such as access to Google Meet & LMS accounts) were appraised and issues, if any, were ameliorated by one-to-one consultation with each invigilator with the help of an in-house technical team.
Technical requirements:
1. Examination Venue: IT Lab 1A and 1B with the capacity of 50 desktop Personal Computers (PCs) in each laboratory.
2. Operating System: Windows edition 10 Home Single Language.
3. System: Desktop PC setup with earphones.
4. Processor: Intel(R) Core(TM) i5-6400 CPU @ 2.70 GHz, 4 GB RAM, 64-bit operating system.
5. Internet Specification: A minimum speed of 500 Mbps through LAN cable.
6. Software Specifications:
a. Google Meet: a video-communication portal developed by Google (G Suite education license), compatible with Android and iOS platforms, used for online streaming, recording and storage.
b. Learning Management System (LMS): Moodle 3.7 version, managed by an internal professional team.
Non-technical requirements:
1. Invigilators’ requirements: access to university Google Meet and LMS accounts (as appraised in Step 2).
2. Students’ requirements: (eligibility criteria)
a. Examination registration (attendance, fees clearance, identification card, etc.).
b. Laptop/desktop (with camera enabled).
c. Internet data pack of a minimum 10GB (unlimited data pack was recommended).
d. Access to a university-allotted online Google student account for Google Meet.
e. Access to the university-allotted online LMS.
After provisional approval of the online examination design plan, it was thoroughly tested by the examination and IT technical teams for its technical viability, credibility and reliability, after which the final version of this examination model was approved by the relevant university boards.
STEP 3: Testing the system
Students were briefed on all enquiries based on their feedback from Step 2. Students were also instructed on the conduct of the Mock Online Examination.
Pattern of examination:
• Use of LMS as the examination tool.
• Proctoring was done by live visual streaming of students taking their examination by desktop/laptop and mobile devices through Google Meet with simultaneous window display of the question paper in the LMS platform.
• Question components included single best answer, short essay question (SEQ), and case-based question (CBQ).
• Duration: each Single Response Answer (SRA) was allotted 90 seconds, and each SEQ/CBQ was allotted 15 minutes (an illustrative timing calculation follows this list).
• The response method included checking the correct box for the options given under SRA and typing the answer in the text box provided for SEQ/CBQ.
• Examination rules, regulations and SOPs were posted as self-read material on a specific LMS examination folder, which was made available to both staff and students.
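As an illustration of how these time allotments translate into an overall paper duration, the following minimal Python sketch applies the stated rules (90 seconds per SRA, 15 minutes per SEQ/CBQ); the question counts in the example are hypothetical and are not taken from the actual paper blueprint.

```python
# Illustrative only: the question counts below are hypothetical, not the actual paper blueprint.
SRA_SECONDS = 90        # 90 seconds allotted per Single Response Answer (SRA)
SEQ_CBQ_MINUTES = 15    # 15 minutes allotted per SEQ/CBQ

def paper_duration_minutes(n_sra: int, n_seq_cbq: int) -> float:
    """Total allotted time (in minutes) for a paper with the given question counts."""
    return n_sra * SRA_SECONDS / 60 + n_seq_cbq * SEQ_CBQ_MINUTES

# Example: a hypothetical paper with 60 SRAs and 4 SEQ/CBQs -> 90 + 60 = 150 minutes.
print(paper_duration_minutes(60, 4))
```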
Training of invigilators:
This process started with a briefing on the practical training, followed by hands-on experience to develop the necessary skills in the software and all the relevant platforms. The training familiarized invigilators with the use of Google Meet: opening an account, creating single and multiple Google Meet session links tied to times in the Google Calendar, recording Google Meet sessions, setting up single and multiple audio and video links with students, sharing these URLs with students via email, and extracting the recorded video of each session to the database. All invigilators were already familiar with the use of the LMS for examination purposes, as it was used frequently during teaching-learning activities.
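As a hedged illustration of the link-creation task practised in this training, the sketch below shows how a student-specific Google Meet link could be generated programmatically through the Google Calendar API v3; this was not necessarily the procedure used (links can equally be created manually in Google Calendar), and the credentials, calendar ID, time zone and e-mail address are assumptions.

```python
# Sketch only: assumes google-api-python-client and valid OAuth/service-account
# credentials with Calendar access; identifiers below are placeholders.
import uuid
from googleapiclient.discovery import build

def create_meet_link(creds, student_email: str, start_iso: str, end_iso: str) -> str:
    """Create a calendar event with an attached Google Meet link and return that link."""
    service = build("calendar", "v3", credentials=creds)
    event = {
        "summary": f"Online examination - {student_email}",
        "start": {"dateTime": start_iso, "timeZone": "Asia/Kuala_Lumpur"},
        "end": {"dateTime": end_iso, "timeZone": "Asia/Kuala_Lumpur"},
        "attendees": [{"email": student_email}],
        "conferenceData": {
            "createRequest": {
                "requestId": str(uuid.uuid4()),
                "conferenceSolutionKey": {"type": "hangoutsMeet"},
            }
        },
    }
    created = service.events().insert(
        calendarId="primary", body=event, conferenceDataVersion=1
    ).execute()
    return created.get("hangoutLink", "")
```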
STEP 4: Implementation of the mock exam
After establishing the Google Meet connection with the invigilators, the students were requested to share their entire screen with the invigilators. Upon approval from the invigilators, students were instructed to open the examination folder in the LMS. Following instruction from the invigilators, students commenced reading the examination regulations and SOP for the online examination. At the stipulated time, students were asked to open the examination question paper using a password provided individually via WhatsApp. At the conclusion of the examination, students cross-checked that all questions had been attempted and, with the permission of the invigilators, confirmed that the submit button had been clicked to submit the answers before exiting the platforms, thus marking the end of the examination.
IT Lab Workstation Design Plan: Each invigilator was accommodated in a climate-controlled, ergonomically structured IT laboratory, observing a physical distance of one metre. As shown in Figure 2, each invigilator was assigned two internet-enabled (LAN-connected) desktop PCs and two headphone sets (ear lobe covered). Each PC displayed three students' Google Meet sessions in separate browser windows, with each student's full shared screen showing the LMS page resized to fit the PC screen. Thus, each invigilator monitored a total of six students' Google Meet links, three on each of the two PCs. All information was stored on a secure server and later retrieved from the data centre as and when necessary.
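A minimal sketch of the allotment logic implied by this workstation design, assuming simple Python lists of student and invigilator identifiers: six students per invigilator, split as three Google Meet sessions on each of the two PCs. The function and variable names are illustrative, not the actual allocation tool used by the examination team.

```python
# Illustrative allotment: six students per invigilator, three per PC (see Figure 2).
from typing import Dict, List

def allot_students(students: List[str], invigilators: List[str]) -> Dict[str, Dict[str, List[str]]]:
    """Group students into batches of six per invigilator, split across two PCs of three each."""
    allotment: Dict[str, Dict[str, List[str]]] = {}
    for i, invigilator in enumerate(invigilators):
        batch = students[i * 6:(i + 1) * 6]
        allotment[invigilator] = {"PC1": batch[:3], "PC2": batch[3:6]}
    return allotment

# Usage: allot_students(student_id_list, invigilator_name_list)
```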
Question paper setting in LMS (a configuration sketch follows this list):
1. Examination question papers were made available in a specially created examination folder in the LMS.
2. Question papers had pre-determined starting and ending times (after ending time, students could no longer attempt the questions).
3. Students were allowed to scroll forward and backward in the question grid (i.e., they did not have to answer in sequential order).
4. Questions and the distractor options were shuffled for each student.
5. The LMS settings ensured that the question paper could not be downloaded.
6. Students had to key in their answers in the spaces provided in the LMS.
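The behaviour described in this list maps onto standard Moodle quiz options. The sketch below summarizes the relevant configuration as a plain Python dictionary; the field names approximate Moodle's quiz settings and the values shown are placeholders, not the settings actually used.

```python
# Sketch of the quiz configuration implied above; field names approximate Moodle's
# quiz settings and all values are placeholders, not the settings actually used.
quiz_settings = {
    "timeopen": "2022-08-01 09:00",    # paper opens only at the scheduled start time
    "timeclose": "2022-08-01 12:00",   # paper closes; no attempts accepted afterwards
    "navmethod": "free",               # free navigation: forward/backward through the question grid
    "shufflequestions": True,          # question order shuffled for each student
    "shuffleanswers": True,            # distractor options shuffled for each student
    "attempts": 1,                     # a single attempt per student
    "quizpassword": "issued-individually",  # password released to each student just before the start
    # Downloading was prevented through the LMS review/browser settings (details not reproduced here).
}
```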
Invigilation Mode: According to the time scheduled in the examination timetable, the invigilators reported to the venue and the Chief Examination Invigilator for the day, took charge of their examination workstation as per the duty allocation by the examination team, and confirmed the workstation readiness as per the training instructions. Based on the student list allotment for each invigilator, which was pre-determined by the examination team (one invigilator was assigned to six students), the invigilator sent a separate, student-specific Google Meet link to every student through email. The students logged in using this link and established the connection with the invigilator 30 minutes before the actual examination starting time, to ensure that any technical issues were sorted out before the start of the examination. Invigilators then informed the students that the session would be video recorded, started recording the Google Meet session, and prepared the students as per the prescribed SOPs, summarized below.
1. Students will display their ID cards/Examination hall tickets.
2. Students will show their surrounding (360 degrees) area, including the desk, through the desktop/laptop camera by rotating the camera to confirm that there are no reading materials, other electronic devices (apart from the permissible mobile phone) or other persons in the space identified by the student taking the examination (Figure 3, Student’s Work Station). Students are informed that the invigilators are at liberty to make such an inspection at any point in time when the student’s activity arouses suspicion.
3. Students are directed to arrange their seating positions such that they face the wall on two sides (Figure 3).
4. Students are directed to place their mobile phone with camera turned on and connected all through the examination on to the wall on their left or right side depending on their seating arrangements (Figures 3 and 4).
5. Students are advised to keep their session audio unmuted throughout the examination period.
6. Students will now be directed to open the LMS in another browser, share this browser with the invigilator, open the examination folder, and start reading the examination hall regulation and SOP for the online examination under the supervision of the invigilator for 30 minutes.
7. Students will now be directed to open the online question paper as per the allotted examination starting time with the password for the question paper shared to individual students five minutes before the examination starting time.
8. Students will be advised to start attempting the question paper in real time. They are also reminded that the question paper is time bound (i.e., it will open and close at a pre-determined time setting), and no downloading is allowed.
9. Six students will be proctored by one invigilator for the entire period of the examination.
10. Students will be advised to raise their hand (physically) in front of the camera to notify the concerned invigilator in case any troubleshooting is required. The invigilator will then assist them individually without disturbing the other students (by muting all students except the one with issues), as each student is connected to the invigilator through a specific link to avoid overlap.
11. Students are advised to contact their invigilator immediately through their WhatsApp video call/normal call in case they have issues with their PCs/internet connection. They are also advised to be in touch with their invigilator until the issues are addressed by the examination/technical team.
12. Students are advised that, as per the online examination rules, if the student loses internet connectivity or experiences any other technical issues that cannot be rectified within 15 minutes, the student will have to take the same exam on another day.
13. Students who wish to use the washroom will have to notify the invigilator, as explained in Step 8 above.
14. Students are advised to focus on the screen as far as reasonably possible, allowing for natural movements; if the invigilator detects suspicious eye movement, the invigilator can ask the student to display the surroundings, as explained in item 2 above, together with a warning, and three such recorded warnings can be treated as an examination irregularity.
15. Students are notified of the last 15 minutes before examination ending time.
16. Students are directed to click on the submit button in the LMS page after they have completed their examination.
17. Students are advised to exit the LMS first and then the Google Meet session once the session recording has stopped.
In the event that the invigilator needs to leave his/her workstation, the invigilator must notify the Chief Examination Invigilator for the day so that a substitute invigilator can be arranged for that period. The invigilator will end the session after confirming that all events have been completed and send the recorded video link for each student to the Chief Examination Invigilator for documentation and storage.
Invigilators will be required to sign the Examination Oath Declaration form before commencement of the examination activity. The Chief Examination Invigilator, together with the Examination Coordinator and Examination Team, will supervise the conduct of the examination.
Security check: The invigilators were instructed to check for screen and cursor movements. Cursor movement observed on the shared screen was expected to remain active, indicating active use of the keyboard and mouse. Full-screen sharing, student eye movement, and the LMS and Google Meet logs were stringently observed, thus providing evidence of each student’s log activity during the examination.
STEP 5: Student mock/trial examination and proctored online examination invigilation feedback
After completing their mock examinations, students were encouraged to complete the pre-determined Google Survey Form to share their experiences, difficulties and any suggestions for improvement. The Google Survey form can be found under Extended data (Mohanraj, 2022).
Similarly, after completing the online mock examinations, invigilators were encouraged to complete the pre-determined Google Survey Form to share their experiences and difficulties, how they handled issues (if any) and, finally, any suggestions for improvement.
Separately designed Google Survey Form links were distributed via email to both students and invigilators.
STEP 6: Students’ and invigilators’ briefing on post mock/trial examination (report from feedback 1)
Students were briefed, through an online Google Meet session, on the overall conduct of the mock examinations (the do’s and don’ts that emerged from them), in line with the invigilators’ proctoring inputs; students were sensitized to any technical shortfalls observed during the examination, and technical guidelines and steps to improve the situation were recommended. Concerns raised in the students’ feedback were looked into and addressed accordingly.
Invigilators were also briefed, through an online Google Meet session, on the overall conduct of the mock examinations. Students’ feedback on the examination and proctoring issues (if any) were raised, addressed and rectified. Concerns raised in the invigilators’ feedback were shared among all invigilators and solutions discussed.
Taking into account students’ and staff’s feedback on the two mock examinations, the examination department addressed and corrected all issues. During the Professional Examination, no technical issues were encountered.
STEP 7: Students’ and invigilators’ briefing on the final proctored online examination
Students were reminded of the Examination Regulation and SOPs for the final online examination. Students were made aware of the changes incorporated into the examination conduct based on feedback and asked to adhere to the dress code during the online examination as per the examination rules and regulations. Invigilators were briefed on the overall proctoring guidelines and changes incorporated thereof, based on the experience and inputs from students’ and invigilators’ feedback from the trial runs of the mock examinations.
STEP 8: Student final examination and proctored online examination invigilation
According to the time scheduled in the examination timetable, and upon fulfilment of all prescribed online examination requirements, the final online examination was conducted as explained in Step 4, drawing on the ground experience gained and the feedback received from both students and invigilators during the trial mock examinations.
The invigilators will check with the Chief Examination Invigilator, who will ensure that all students have attempted and submitted their examination answers correctly in the LMS by cross-checking with the technical team (LMS) before releasing the students. The Chief Examination Invigilator, together with the invigilators, will compile the invigilation report at the end of the examination, to be submitted to the Examination Unit.
STEP 9: Student final exam and proctored online examination invigilation feedback
Students and invigilators, after completing the proctored online final examinations, are encouraged to complete the pre-determined Google Survey Form to share their experiences, difficulties and any suggestions for improvement. Separately designed Google Survey Form links are distributed via email to both students and invigilators. All feedback from both students and invigilators are processed and analysed. Output from the analysis paves the way towards amending the online examination regulation SOPs. This marks the end of the proctored online examination process.
Our process has been tested to address the possibility of malpractice by using the following approach:
1. Use of two video cameras: The use of video feed from the student’s phone and laptop gave us a wide angle to view his/her working space. This prevented the student from using any paper or technology-based information as a malpractice tool.
2. Monitoring students’ eye movement and keyboard activity: The use of split screens ensured that every time the student looked away from the screen, he/she was expected to be looking down to type his/her answers. Whenever the student’s keyboard was in use, the typed content was visible to the invigilator on the LMS examination page. The invigilator monitored the student’s gaze and also scrutinized his/her keyboard activity. An appropriate warning would be given if a change in the point of gaze was observed and/or if the keyboard was found idle for long enough to raise suspicion of malpractice.
3. Preventing third party verbal assistance by using the sound system: The students were instructed to keep their microphones on their devices on at all times. This was to ensure that there was no verbal assistance from anyone in the vicinity of the candidate.
4. Preventing use of digital or paper-based aids by using desk and surrounding check protocols: The students were instructed that, any time the invigilator’s suspicion was credibly aroused, they would be requested to use their camera to show their desk space and the surrounding environment. This improved the invigilators’ confidence and freedom to enforce quality invigilation.
5. Preventing impersonation by using LMS logs: Using IT support, we were able to monitor the number of devices each student used to log on to their LMS page, ensuring that only one device was used to activate the LMS examination page. This ensured that no ghost writer could take the examination on the student’s behalf. Together with the monitoring of student gaze and keyboard activity, this process made impersonation on this online examination platform highly unlikely (a log-screening sketch follows this list).
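As a hedged illustration of the kind of log screening involved (not our exact tooling), the sketch below flags any student whose LMS activity during the examination window originated from more than one IP address, assuming a CSV export of the LMS logs with user, IP and timestamp columns; the file layout and column names are assumptions.

```python
# Sketch only: assumes a CSV export of LMS logs with columns
# "userid", "ip", "timecreated" (Unix timestamp); names are illustrative.
import csv
from collections import defaultdict

def flag_multiple_ips(log_csv: str, exam_start: int, exam_end: int) -> dict:
    """Return {userid: set_of_ips} for users seen on more than one IP during the exam window."""
    ips_by_user = defaultdict(set)
    with open(log_csv, newline="") as f:
        for row in csv.DictReader(f):
            t = int(row["timecreated"])
            if exam_start <= t <= exam_end:
                ips_by_user[row["userid"]].add(row["ip"])
    return {uid: ips for uid, ips in ips_by_user.items() if len(ips) > 1}
```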
In addition to all these measures, we also drew confidence from the professionalism imparted in our curriculum, which many researchers regard as essential (Ludmerer, 1999, Jaiprakash Mohanraj, 2020) and which is tested in such circumstances.
The proposed proctored, synchronous, closed book, online barrier examinations were conducted for Year 2 and Year 5 MBBS students at a Malaysian medical and healthcare university. We present the findings from the Year 2 Professional MBBS Examination using the proposed system. As indicated in Tables 1 and 2, 52% of the student population were women and, within this cohort, 52% were international students. The students were aged between 18 and 21 years. The examinations were proctored by twenty-six lecturers aged between 35 and 70 years. As most of the Teaching and Learning Activities (TLA) had been conducted on the LMS and through Google Meet during the online sessions, both students and lecturers were comfortable with, and welcoming of, this new system. All the students took their examinations from the comfort of their homes or other preferred locations. It is worth noting that all the international students (from over eight countries) participated in this examination from their home countries. The full raw data can be found under Underlying data (Mohanraj, 2022).
Test for reliability: Table 3 indicates the mean scores of the cohort in all their continuous assessment examinations, which were conducted face to face, and the mean score for the Professional examination conducted online. The test for reliability shown in Table 4 indicates a Cronbach's alpha of 0.714 (based on standardized items), which indicates that the students' performance across all these examinations was reliable and consistent. The results indicate that students' performance in this online examination is consistent with all their previous continuous assessments held earlier in their academic year.
| Assessment | Mean (%) | SD | N |
|---|---|---|---|
| EPI(F2F) | 68.80 | 10.55 | 158 |
| GIN(F2F) | 69.33 | 10.95 | 158 |
| MSK(F2F) | 67.36 | 10.93 | 158 |
| UIR(F2F) | 60.98 | 10.24 | 158 |
| NVS(F2F) | 77.31 | 8.85 | 158 |
| RPD(F2F) | 77.20 | 7.50 | 158 |
| EDM(F2F) | 70.50 | 11.59 | 158 |
| PRO(OPS) | 58.40 | 11.02 | 158 |
| Cronbach's alpha | Cronbach's alpha based on standardized items | No. of items |
|---|---|---|
| 0.712 | 0.714 | 8 |
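For transparency, the (unstandardized) Cronbach's alpha in Table 4 can be reproduced from the matrix of the eight assessment scores with a short script based on the standard formula α = k/(k−1) × (1 − Σ item variances / variance of total score); the file and column names below are placeholders for the underlying data.

```python
# Sketch: compute Cronbach's alpha over the eight assessment scores (one column per assessment).
import pandas as pd

def cronbach_alpha(scores: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Usage (placeholder file and column names for the underlying data):
# scores = pd.read_csv("block_scores.csv")[["EPI", "GIN", "MSK", "UIR", "NVS", "RPD", "EDM", "PRO"]]
# print(round(cronbach_alpha(scores), 3))
```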
Test for validity of method adopted to conduct online examination: To test the extent to which this assessment tool measures what it claims to measure, we used the following tests.
Face validity: Student and staff perceptions were collected as feedback after the examination. The results indicate that both staff and students were quite satisfied with the process and its convenience, and expressed considerable confidence in this system. Table 5 summarizes the students’ opinions regarding these examinations.
Content validity: Since all the measuring tools used in the face-to-face examinations were retained in this online examination, none of the assessment strategies from any of the previous examinations were altered or modified. The examination tested the delivered curriculum in depth and range, which was supported by positive reviews from independent external examiners.
Criterion validity: We tested for correlation between the continuous assessment results and the online Professional examination results obtained by this cohort of students. The results in Table 6 show that, except for the GIN block, performance in all the other blocks was positively correlated with the Professional examination, significant at the 0.01 level. These results indicate that this online examination method yielded results that reflect the same construct as students’ performance in continuous assessment. Figure 5 also indicates that the performance of this cohort of students, in terms of pass percentage, was within the range of performance observed over the last decade.
| | | EPI(F2F) | GIN(F2F) | MSK(F2F) | UIR(F2F) | NVS(F2F) | RPD(F2F) | EDM(F2F) |
|---|---|---|---|---|---|---|---|---|
| PRO (OPS) | Pearson | 0.431** | 0.064 | 0.313** | 0.562** | 0.472** | 0.289** | 0.454** |
| | Sig. | 0.000 | 0.427 | 0.000 | 0.000 | 0.000 | 0.000 | 0.000 |

** Correlation is significant at the 0.01 level.
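The values in Table 6 follow from pairwise Pearson correlation tests between the online Professional examination (PRO) score and each face-to-face block score; a minimal sketch using scipy is shown below, with placeholder file and column names matching the table labels.

```python
# Sketch: Pearson correlation of the online Professional exam (PRO) with each F2F block score.
import pandas as pd
from scipy.stats import pearsonr

BLOCKS = ("EPI", "GIN", "MSK", "UIR", "NVS", "RPD", "EDM")  # placeholder column names

def correlate_with_pro(scores: pd.DataFrame) -> dict:
    """Return {block: (r, p_value)} for each block's correlation with the PRO score."""
    return {block: pearsonr(scores[block], scores["PRO"]) for block in BLOCKS}

# Usage (placeholder file name for the underlying data):
# scores = pd.read_csv("block_scores.csv")
# for block, (r, p) in correlate_with_pro(scores).items():
#     print(f"{block}: r = {r:.3f}, p = {p:.3f}")
```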
The proctored, synchronous, closed-book, online examination described above has been demonstrated to be a trustworthy and legitimate procedure that may be utilised to administer any high-stake examination. With a major portion of 2020 and 2021 affected by COVID-19, medical education was compelled to shift to online delivery of content, and thus arose the need for a reliable online method of assessing the curriculum. Currently, there are varying practices in many schools and universities that have adapted online assessment tools to fit their purpose; however, there are no standard guidelines, tools or methods that have been proven to be reliable alternatives in the education sector. Thus, in this paper, we demonstrated that our suggested examination procedure is an effective instrument for assessing the curriculum and is comparable in reliability, efficiency and convenience to the time-honoured face-to-face mode of assessment. Additionally, we propose that this examination method be utilised in conjunction with assignments to assess students' performance in various distance learning modules.
One of the most advantageous features of our suggested online testing approach is its low cost. We utilised the current Learning Management System (LMS), which is a free Moodle-based system, in conjunction with the complimentary Google Meet account. The majority of institutions of higher learning have implemented a curriculum delivery method that is heavily reliant on an IT-based support system. This translates into the usage of LMSs for sharing learning materials and administering examinations, as well as the use of computer laboratories for Computer-Assisted Learning. We have shown that our suggested approach can be used to administer a successful online test utilising existing technology, at essentially no additional expense.
Our online, proctored examination was very well received by both teachers and students, with particular emphasis on the simplicity with which this innovative approach could be adopted. The students, being tech-savvy, had no trouble understanding the information and following the protocol. A digital divide between students and lecturers has been observed by many researchers (Mohanraj et al., 2019, Sabqat and Khan, 2019), and it seems to be growing with time. Against this background, our success lay in the fact that, even though a fair number of the examiners involved were digital novices or digital immigrants, there were no issues in grasping this technology-dependent process. One factor that facilitated this universal acceptability was that there was no requirement to learn new software, as we used familiar tools and tweaked them to our purpose. Researchers have shown that students are subjected to stress in their academic journey, and examinations are one of the major contributing factors (Sohail, 2013, Fares et al., 2016) that can determine their performance. Thus, our effort was to provide a system that did not add any additional stress to the students’ experience of taking their examinations, which was achieved by integrating well-known and familiar platforms into the system.
The limitations of our system include the need to maintain a favourable student-to-staff ratio. In a traditional face-to-face environment, the common practice is to assign one staff member to every 20 students; in our system, we used one staff member for every six students. We recognize that the fewer students assigned to each invigilator, the better the quality of invigilation. This could be mitigated by engaging part-time invigilators specifically trained for this exercise. Secondly, the strength of the internet connectivity for both students and invigilators is the backbone of a smooth online examination; this can be hampered by unforeseen issues or a lack of student preparation (with regard to data package availability). Lastly, any technology-dependent system is only as good as its infrastructure and the support staff who maintain it, and we recognise that this can be a shortcoming in some educational institutions. However, since we propose using only existing systems, good institutional internet connectivity and an up-to-date operating system are all that is required to conduct this examination. We recommend that this examination system be implemented among multiple cohorts, both within a university and across different universities, to further establish its reliability and validity.
The suggested proctored, synchronous, closed book, online examination can assist in moving ongoing assessments and high-stake examinations from manual processes to an online platform for remote learners, which is especially important in these trying times. This system has a high reliability index and repeatability, guaranteeing that the standard and quality of a face-to-face examination are fulfilled in every way.
This method is extremely cost effective because it does not necessitate the purchase of any additional hardware or software. This form of online examination may be performed at any educational institution with their existing IT support, good internet access, and a resourceful IT support team. Furthermore, because no new software or hardware is utilised, no substantial training/workshop is necessary for invigilators or students to comprehend and operate this method.
Thus, we suggest that this proctored, synchronous, closed book, online examination system is reliable, efficient, effective, economical and convenient and recommend it to be used in all the educational sectors that engage in student learning.
Figshare: Data for Repository.xlsx. https://doi.org/10.6084/m9.figshare.20024183.v2 (Mohanraj, 2022).
This project contains the following underlying data:
This project contains the following extended data:
Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).
• The corresponding author is responsible for ensuring that the descriptions are accurate and agreed by all authors.
• All the authors were involved in conceptualization and validation of this study.
• Jaiprakash Mohanraj (JPM), Ivan Rolland Karkada (IRK) were involved in data collection & resources management.
• JPM, Rusli Bin Nordin (RN) & IRK were involved in writing the original draft, review and editing process.
• JPM & IRK were involved in project administration, visualization and writing of the original draft.
• JPM, RN & AFS were also involved in supervision.
We would like to express our deepest appreciation to the management of MAHSA University for providing us access to the examination and other necessary support that helped us complete this research work.
Is the rationale for developing the new method (or application) clearly explained?
Yes
Is the description of the method technically sound?
Yes
Are sufficient details provided to allow replication of the method development and its use by others?
Yes
If any results are presented, are all the source data underlying the results available to ensure full reproducibility?
Yes
Are the conclusions about the method and its performance adequately supported by the findings presented in the article?
Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Digital education, higher education, online proctored exams
Is the rationale for developing the new method (or application) clearly explained?
Yes
Is the description of the method technically sound?
Yes
Are sufficient details provided to allow replication of the method development and its use by others?
Yes
If any results are presented, are all the source data underlying the results available to ensure full reproducibility?
Partly
Are the conclusions about the method and its performance adequately supported by the findings presented in the article?
Yes
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Medical education, Anatomy