Keywords
QVR, quality variance reports, monitoring and evaluation, hospital quality, key performance indicator, Microsoft SharePoint, Microsoft Excel
Background
Establishing a systematic way to measure and utilize indicators for improvement is one of the most challenging issues in the monitoring and evaluation of indicators in healthcare settings.
In response to these fundamental challenges, we designed a monitoring and evaluation system incorporating a hospital-wide quality variance report (QVR) system using linked Microsoft Excel® spreadsheets on Microsoft SharePoint®.
Methods
Indicators were determined at the departmental/unit level in line with institutional goals, departmental functions, quality expectations, inputs/outputs, clinical priorities, and compliance with policies/procedures/protocols/guidelines/pathways, as well as in response to gaps in service delivery identified during root cause analyses.
The sample design was determined in accordance with the characteristics of the population. Drawing of sample units was done using a simple random sampling technique without replacement or systematic random sampling.
Indicator monitoring was enhanced visually by allocating colour codes based on performance across the months and quarters.
The action plan tab provided a platform for documenting corrective actions arising from the performance reviews.
Results and discussion
The QVR reporting system ensured a standardized format of monitoring throughout the institution with a reduced turnaround time from data collection to analysis. Further, continuity of the monitoring and evaluation (M&E) system was guaranteed even if an individual left the institution.
The analysis of the QVR allowed hospital-wide trending on cross-cutting indicators with consequent ease of communication to multiple stakeholders.
The automation has saved time and increased accuracy which has enhanced credible engagements during quality meetings.
Conclusions
Use of this system greatly enhanced quality performance monitoring in the hospital and the identification of major bottlenecks that warranted hospital-wide or departmental-level projects. The QVR system enhanced the efficiency and accuracy of quality monitoring from data collection through to performance reviews. The QVR structure also allows for the customized development of M&E database application software.
We have made several revisions based on the reviewers' comments:
Title: We made a small revision to the title for better alignment with the content.
Abstract: The abstract has been revised for clarity and completeness.
Introduction/Background: We restructured this section by relocating paragraphs to improve flow and added more information to enhance clarity, particularly about the paper's focus as a methodology paper rather than a research study article. Additional details have been included to ensure comprehensiveness.
Methods and Results: These sections have been reorganized, placing paragraphs in the most suitable locations and adding more details to clarify the innovation's purpose and how it addresses the identified gap. Additional details have been included to clear any confusion and ensure comprehensiveness.
Discussion and Conclusion: These sections have been restructured to maintain coherence and connectedness.
Monitoring and evaluation encompass the systematic establishment of indicators, data collection and analysis, as well as periodic reviews to identify suitable action plans for improved performance of ongoing processes (Fajardo Dolci, Aguirre Gas, and Robledo Galván 2011). Indicators are specific metrics that enable the evaluation of the achievement of goals and objectives; such metrics are specifically referred to as key performance indicators (KPIs). It is imperative that KPIs are SMART (specific, measurable, achievable, relevant and time-bound) and linked to inputs, activities, outputs, outcomes, and goals (Kusek and Rist 2004). A robust monitoring and evaluation system ought to incorporate mixed research approaches (Bamberger, Rao, and Woolcock 2010). The involvement of all relevant stakeholders is necessary in the determination of indicators to be tracked (Emberti Gialloreti et al. 2020, Malone et al. 2014, Diallo et al. 2003, Guijt 1999). Participatory monitoring and evaluation involves incorporating all stakeholders in all stages and processes of the M&E system; this engagement helps in determining the way forward and what modifications are needed for better M&E (Izurieta et al. 2011, Estrella et al. 2000, Guijt 1999, Estrella and Gaventa 1998).
The most challenging issue in the monitoring and evaluation of indicators in healthcare is the development of a systematic way of measuring and utilizing indicators for improvement. Development of such a system that incorporates variance reporting requires concerted effort and the inclusion of every team member (Brown and Nemeth 1998). There are myriad challenges to the design, development, and implementation of an effective M&E system, such as stakeholder buy-in, knowledge gaps, time constraints, logical frameworks, available resources, data validity, sampling techniques, and indicator measurements (Emberti Gialloreti et al. 2020, Anderson et al. 2019, Mthethwa and Jili 2016, Cameron 1993). Data quality challenges are also experienced when monitoring and evaluation systems compile information directly from Hospital Management Information Systems (HMIS) (Nshimyiryo et al. 2020). Emberti Gialloreti et al. (2020) concluded that multiple stakeholder involvement is necessary for a successful and timely system for the collection and monitoring of health information. Participatory M&E systems promote efficient, relevant, and effective processes leading to elaborate corrective action (Guijt 1999).
Similar challenges were observed in the quality monitoring and evaluation processes at the Aga Khan University Hospital, Nairobi. Key among these were: i) Delayed reporting of quality indicators was commonplace due to the transmission of data over multiple platforms such as email and PowerPoint presentations, most of which required multiple actors and human-to-human interaction; ii) The selection of quality improvement projects and indicators was mostly based on institutional-level decisions. The involvement of unit-level employees in deciding which quality indicators to monitor, or how to monitor them, was minimal. The attention allocated in unit meetings to evidence-based quality improvement initiatives was limited in comparison to that given to purely operational matters. Only the mandatory indicators were monitored, especially those determined by the board and the hospital leadership; these came commonly from the clinical units, and very few non-clinical units participated; iii) A participatory approach to quality improvement was limited since data and results were not easily accessible except to the data provider at the unit level, their unit head, and the quality department. There was no platform where employees in a unit could track their performance. Data and results were only shared via email and hence had a circulation limited to those on the mailing lists; iv) Multiple versions and copies of the same data were often held in multiple locations and updated at different times. Often the department at the source of the data had the most current data, while data held at the institutional level would lag behind; and v) Data backup was done inconsistently and did not leverage the institution’s ICT infrastructure. As such, the institution was prone to losing quality monitoring data if, for example, information held on a particular computer became corrupted.
The challenges described above proved to be a stumbling block to establishing and monitoring quality improvement initiatives and to achieving a data-driven decision-making approach to quality improvement. As such, the quality department team at Aga Khan University Hospital, Nairobi set out to create a reporting platform that would serve as a ‘single source of truth’ accessible to all stakeholders, from the department level to the top leadership. In addition, it would aid the institution in maintaining and sustaining the quality accreditation status earned a few years previously, i.e. the Joint Commission International Accreditation (JCIA). The QVR system started with basic features, which were progressively enriched with additional functionalities to tackle emerging challenges and capitalize on opportunities encountered during the monitoring and evaluation of quality indicators. This iterative refinement process ultimately led to the development of a comprehensive quality monitoring system.
This article provides an overview of the KPI monitoring system developed and implemented within the Aga Khan University Hospital, Nairobi. It outlines the comprehensive rollout of this system throughout the entire hospital and discusses the significant improvements observed following its implementation. The primary aim of this paper is therefore to offer a thorough overview of the QVR quality monitoring system innovation. Additionally, it describes the challenges encountered during the system’s development and the strategies employed to overcome them, and highlights the accomplishments resulting from the successful implementation of the QVR system.
To guide the development of this methodology paper, we followed the F1000Research template and utilized the revised Standards for Quality Improvement Reporting Excellence (SQUIRE 2.0) (Goodman et al. 2016). The methods section includes a general introduction on developing the quality monitoring innovation, followed by detailed explanations of various features. These include how indicators were determined, sampling approaches, key features of the QVR tool, and data security and backup procedures.
To overcome the fundamental challenges identified in the previously existing M&E structure, we designed and developed a monitoring and evaluation system incorporating hospital-wide key performance indicators. We named it the quality variance report (QVR) system; it was developed using linked Microsoft Excel 2010 (Microsoft Excel, RRID:SCR_016137) spreadsheets on an on-premises Microsoft SharePoint 2010 (intranet) server. SharePoint, as a Microsoft product, has features that are updated with every new release. Among the functions available are web content management, social media functionality, and search features, among others (Perran et al. 2013). The utilization of SharePoint fosters centralized online access to collaborative information (Ennis and Tims 2012). Knowledge sharing via various media is possible through SharePoint (Weldon 2012). A well-constituted SharePoint implementation strategy enables an institution to enhance collaboration, communication, and storage (Diffin et al. 2010) and, in addition, enables flexible spreadsheet data linkage (Prescott et al. 2010).
This was an improvement on the previous KPI monitoring process, which entailed submission of hard copies of filled data collection tools to the monitoring and evaluation unit. This was a time-consuming process beset by challenges, including lost forms, late submission, illegible handwriting, and transcription errors. In a study conducted by Jordans et al. (2019), illegible information was considered incorrect, which was similar to our experience. Automation of data collection tools was one way our project intended to reduce the challenge of illegible handwriting.
The previous system was heavily dependent on individual quality focal points from the departments, who maintained data on standalone computers and shared it by email. This presented another challenge during staff turnover, or in cases where data and related literature were lost because machines crashed or data were accidentally deleted. There was also a lack of standardization in how the data were maintained among the various quality focal points. The Microsoft SharePoint server hence came in handy, although its utilization brought about the need for elaborate training to ensure maximum utilization (Dahl 2010). Initially, utilization of SharePoint was limited to document management, where staff working on policies updated the drafts in a dedicated folder on Microsoft SharePoint, enhancing collaboration; this was later extended to indicator monitoring.
The QVR monitoring and evaluation (M&E) system refers to a mechanism that enables the appearance and availability of KPI dashboards to be standardized and linked using spreadsheets on a Microsoft SharePoint server. The QVR adopted its name from the ability of the quality monitoring dashboards generated to show the difference between achievement and targets/benchmarks (variance). The quality indicators are gathered from the smallest unit level to the highest institutional level. The highest hierarchy of classification of the hospital’s quality leadership comprises the institutional leadership, divided into clinical and non-clinical departments that branch further into various sub-sections and units. KPI dashboards are aligned in the same order, from the smallest unit to the highest in the echelon. A Microsoft SharePoint server anchored in the institution’s intranet configuration enables data capture and access through spreadsheets to be done centrally by everyone with access rights. Access rights are obtained from the institution’s Information Communication and Technology (ICT) administrator. The spreadsheets are linked, allowing similar indicators to be grouped together at a higher level of the departmental structure up to the institutional level, which enhances comparisons. The linking of data from data-entry screens allows seamless real-time tracking of indicators. The spreadsheets are saved in folders arranged in line with departmental organograms, with sub-folders equivalent to all sub-units; the data entry sheets (templates) are placed in each of the unit folders. Figure 1 demonstrates how QVRs are linked from the unit level to the institutional level. Several units comprise a section/program, and several of these form a department. X in the illustration indicates the possibility of multiple available departments/units.
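The roll-up performed by the linked spreadsheets can be pictured in code. The following is a minimal, hypothetical Python sketch (the function and unit names are illustrative, not part of the QVR system itself) of how unit-level numerator/denominator pairs for one indicator are summed upward so that the same indicator can be compared at the section, department, and institutional level:

```python
# Hypothetical sketch of the roll-up that the linked Excel formulas perform:
# unit-level (numerator, denominator) pairs for an indicator are summed
# upward so the indicator is comparable at each level of the hierarchy.

def aggregate(indicator, units):
    """Sum (numerator, denominator) pairs for one indicator across units."""
    num = sum(u[indicator][0] for u in units)
    den = sum(u[indicator][1] for u in units)
    return num, den, (num / den if den else None)

# Illustrative unit-level data: (compliant opportunities, total opportunities)
ward_a = {"patient_identification": (45, 50)}
ward_b = {"patient_identification": (38, 40)}

num, den, rate = aggregate("patient_identification", [ward_a, ward_b])
print(f"Departmental compliance: {num}/{den} = {rate:.1%}")
# → Departmental compliance: 83/90 = 92.2%
```

In the actual system this summation is done by linked cell formulas; the sketch only illustrates the aggregation logic described above.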
The indicators were determined at the departmental/unit level in line with the institutional goals, departmental functions, quality expectations, inputs/outputs, clinical priorities, and compliance with policies/procedures/protocols/guidelines/pathways, as well as in response to gaps in service delivery identified during root cause analyses. For example, correct patient identification was chosen as a hospital-wide indicator in fulfillment of the International Patient Safety Goal number one (IPSG 1) standard contained in the Joint Commission International Accreditation (JCIA) that the institution subscribed to. This indicator measures the proportion of correctly identified patients out of all the opportunities for identifying a patient monitored while delivering care. At least two unique identifiers, as per hospital policy, were required for the process to be classified as correct. Data on this indicator were collected through observation. Further details on data collection and aggregation are portrayed in Figure 2, while a snapshot of how the end product is viewed is shown in Figure 3. The committees that prepare the quality plans and indicator measures are referred to as departmental quality improvement and patient safety (DQIPS) committees. Each department has a DQIPS with a sufficient number of nominees to represent the core functions of the department. It is usually essential to incorporate all stakeholders in the formulation of indicator measures (Diallo et al. 2003). Each department has a quality advisor attached to it, from the quality department, who acts as a liaison between the department’s needs and quality expectations. The indicators from the various departments are collated at the institutional quality improvement and patient safety (QIPS) committee and are then presented to the hospital leadership at the joint staff committee (JSC) for review and approval.
Figure 2 portrays the data aggregation process flow we developed to guide the practice. The process begins with the identification of what needs to be measured; a review of the literature is undertaken to establish appropriate research questions and benchmarks, if available (where benchmarks are not available, internal targets are set after the pilot phase). A research design is then determined, through which data collection tools are designed and developed; sample determination follows, data collection methods and an analysis plan are established, and tool validation is done during the pilot phase. When the data tools and data collection teams are deemed appropriate, collection and processing of data are done, and trends are monitored over time. The frequency of data collection depends on the indicator being monitored; the data entry sheets are designed in such a way that data are collected continuously. The data are networked/mapped in an orderly manner where similar indicators are aggregated together at a higher level, which leads to better monitoring (Abdi, Majdzadeh, and Ahmadnezhad 2019). For information to be useful, data ought to be gathered at a well-defined frequency (Prennushi, Rubio, and Subbarao 2002, Guijt 1999).
Our sampling design was determined in accordance with the characteristics of the population. Chander (2017) avers that sample size should be determined by considering factors such as the design, sampling method, the indicator/outcome being measured, design effect and variance, power, level of significance, and the level of precision desired for the indicator estimates. We incorporated these considerations in sample determination and availed a sampling guideline to the users of the QVR. In this M&E intervention, sample units were drawn using either simple random sampling without replacement or systematic random sampling. Microsoft Excel provided users with random number generator functions (=RAND(), =RANDBETWEEN(x,y)) and the ‘Analysis ToolPak’ add-in, which were used in randomization. Depending on their suitability, any of these randomization functions were employed in drawing the sample units. A sampling guideline was developed, with the assistance of a statistician, to help users with easier implementation.
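For readers who prefer to see the two sampling techniques spelled out, here is a short Python sketch of both. It is an illustration of the generic techniques named above, not of the QVR's Excel implementation; the function names and the example population of 100 record numbers are assumptions:

```python
import random

def simple_random_sample(units, n, seed=None):
    """Simple random sampling without replacement
    (the role played by =RAND()/=RANDBETWEEN() in Excel)."""
    rng = random.Random(seed)
    return rng.sample(units, n)  # sample() never repeats an element

def systematic_sample(units, n):
    """Systematic random sampling: every k-th unit from a random start."""
    k = len(units) // n          # sampling interval
    start = random.randrange(k)  # random start within the first interval
    return [units[start + i * k] for i in range(n)]

population = list(range(1, 101))   # e.g. 100 medical record numbers
print(simple_random_sample(population, 10, seed=1))
print(systematic_sample(population, 10))
```

Either function yields 10 units; the choice between them would follow the sampling guideline mentioned above.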
Data collection tools were designed specifically to suit the measurement criteria of each indicator, in a format that ensured easy entry and linkage with other indicators for results aggregation. The data entry sheets were created in user-friendly formats to provide a platform for easier data entry, for instance through drop-down lists where applicable; restrictions were also applied to ensure the specified data were entered correctly in the cells. Customized error messages appeared if an incorrect data type or format was entered. The Excel functions in the data entry screens were locked for editing (using passwords accessible only to the M&E unit) to prevent alteration by users during data entry. Figure 3 portrays an example of a data entry sheet used to collect data on compliance with the requirement for correct patient identification while administering treatment or providing any other service to a patient. Columns B, G, H and J have drop-down lists for easier data entry, and all columns have restrictions such that only data in the specified format can be entered. Column E contains the medical record numbers of the patients sampled, column F provides a description of the procedure or service audited, and column J the category of staff observed; the names of the staff observed and of the auditor are also captured. The data from the sheet from which this snapshot is obtained are aggregated into a dashboard by use of a formula separating the numerator (the opportunities complied with) from the total number of opportunities observed, and a proportion of compliance is computed.
QVR is a workbook that comprises several sheets as shown in Figure 4. The tab named QVR contains the master KPI dashboard described under the QVR master sheet section. The Definitions tab contains a sheet that presents explanations about the indicators and how they are measured. The dashboard tab presents data in terms of absolute numbers as well as in terms of ratios, proportions and rates, and obtained data (linked) from the data entry sheets. An action plan tab provides the users with a platform for documenting gaps, root causes and the resulting action plans.
The QVR master sheet comprises various components. As demonstrated in Figure 5, the first column contains an indication of the current year of tracking and the indicators measured, while the second column defines how the indicators are measured. The third column contains either a target or a benchmark corresponding to each indicator; a benchmark is derived from literature sources or from other hospitals’ accomplishments (in the absence of these, internal targets are determined). The previous year’s average performance is presented to facilitate comparisons with the current year’s performance. Monthly performance is directly linked to the data entry sheets and populates automatically as the data are entered. Quarterly performance has functions that compute the performance levels from the monthly data. The trends column uses sparklines, which portray monthly trends from January to December. Variance indicates the percent difference between the benchmark/target and the quarterly performance. The annual column indicates performance from the beginning of the year up to the current date. Literature references provide the sources from which benchmarks are derived. Hazard vulnerability analysis provides rating metrics for the risk score based on the likelihood of risk occurrence as well as the consequences/severity of occurrence, as per the guide obtained from the institution’s risk program.
KPI monitoring is enhanced visually by allocating colour codes to performance across the months and quarters. The conditional formatting feature available in Microsoft Excel is utilized to promote visual assessment of indicators. In the configuration, green portrays an indicator that has met or exceeded the benchmark or target set. Red appears if the indicator measure’s level is below the benchmark/target. Amber appears if the performance is below the benchmark but higher than the average performance of the previous year. White is present if, for that particular period, the indicator is not applicable, and finally grey represents ‘grey areas’, where data are due but not yet populated. All these colour codes are set to change automatically as the data are input into the linked data entry sheets. The final indicator measure on each QVR master sheet is the measure of timeliness of data provision, expressed as the proportion of all the indicators measured by the unit/department that have data populated by the 5th of each subsequent month. The illustration of the colour code configuration is shown in Figure 6.
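The colour-code rules and the variance column lend themselves to a compact sketch. The Python below restates the rules just described, assuming a higher-is-better indicator; the function names and numeric examples are illustrative, and the real system expresses the same rules as Excel conditional-formatting conditions rather than code:

```python
# Sketch of the conditional-formatting rules described in the text,
# assuming a higher-is-better indicator:
#   green = met or exceeded benchmark/target
#   amber = below benchmark but above last year's average
#   red   = below benchmark and not above last year's average
#   white = not applicable this period; grey = data due but not yet entered

def colour_code(value, benchmark, last_year_avg, applicable=True):
    if not applicable:
        return "white"
    if value is None:          # data due but not yet populated
        return "grey"
    if value >= benchmark:
        return "green"
    if value > last_year_avg:
        return "amber"
    return "red"

def variance(value, benchmark):
    """Percent difference between performance and benchmark/target."""
    return (value - benchmark) / benchmark * 100

print(colour_code(0.95, 0.90, 0.85))   # green
print(colour_code(0.88, 0.90, 0.85))   # amber
print(colour_code(0.80, 0.90, 0.85))   # red
print(colour_code(None, 0.90, 0.85))   # grey
```

Note that for lower-is-better indicators (e.g. infection rates) the comparisons would be reversed; the QVR's per-indicator conditional-formatting rules handle that case.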
The action plan tab consists of a platform that aids in documenting corrective actions arising from the performance reviews/evaluations. Reviews are made at least once a month and gaps are listed in the action plan sheet. The gaps are identified by checking the colour codes, the trends column, and the variance column (see Figures 5 and 6). If an indicator presents more reds compared to other indicators, this suggests a gap that needs to be dealt with; similarly, if the trends column shows a poorly performing trend or a sudden change either upwards or downwards, this presents an opportunity for root cause analysis. Finally, in the variance column, a greater percentage variance presents an opportunity for further action. Figure 7 portrays the components of the action plan tab. It has a serial number column, the month, the indicator with undesired performance, the gaps and their causes as identified through a root cause analysis, the action plan containing the corrective action, a timeline for closure, the individual responsible for follow-up, the review date, and additional comments. The process of evaluation hence incorporates both quantitative and qualitative components.
To ensure effective utilization of the various components and features, training was provided to each new user. Previously trained users were expected to pass on the knowledge, skills, and competencies acquired to their colleagues within departmental units. Advanced training needs identified were met by the Quality department through the M&E unit, in collaboration with the ICT training unit. Gaining the right competencies enhances the success of the M&E. Successful training refers to the proper acquisition of competencies, skills, and knowledge that are imperative in achieving a working M&E system (Fletcher et al. 2014).
The QVRs were monitored at the beginning of every month, i.e. by the 5th of each month, to ensure that all the requisite data were collected and entered in good time, hence impacting the utilization of the results. A score was allocated every month based on the availability of data as a proportion of all the KPIs in each unit/department’s QVR. Similarly, utilization of various features, such as the determination and review of an action plan, was taken into consideration when establishing the best performing departmental unit of the year.
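The monthly timeliness score can be made concrete with a small sketch. This is a hypothetical Python rendering of the rule described above (the proportion of a unit's KPIs populated by the 5th of the subsequent month); the function name and example dates are assumptions:

```python
# Sketch of the monthly timeliness score: the proportion of a unit's KPIs
# whose data were populated by the 5th of the month following the
# reporting month.

import datetime

DEADLINE_DAY = 5

def timeliness_score(kpi_entry_dates, month_end):
    """kpi_entry_dates: one entry date per KPI (None if never entered)."""
    # The 5th of the month after the reporting month:
    deadline = (month_end + datetime.timedelta(days=1)).replace(day=DEADLINE_DAY)
    on_time = sum(1 for d in kpi_entry_dates if d is not None and d <= deadline)
    return on_time / len(kpi_entry_dates)

# Four KPIs for a January reporting period: two entered by 5 February,
# one entered late, one never entered.
dates = [datetime.date(2017, 2, 3), datetime.date(2017, 2, 5),
         datetime.date(2017, 2, 9), None]
print(f"{timeliness_score(dates, datetime.date(2017, 1, 31)):.0%}")  # → 50%
```

In the QVR itself this score is a worksheet formula over the linked entry sheets; the sketch only restates the counting rule.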
Before the implementation of the QVR reporting system, there was a major challenge with regard to data collection and sharing with the M&E unit for processing. Monitoring consistency in data collection was a key priority at the point of introduction of the QVR system. This was done by keeping data completion logs for each department. Before the introduction of the QVR system, in situations where a data provider was away, data would not be available until that data provider resumed work and shared it. The situation was worse in cases of service interruption through either staff dismissal or resignation, since in some instances a replacement would not be available immediately, or the incoming staff might need a prolonged period to adapt. After the implementation of the QVR system, each unit head was required to be actively involved in the entire process. In addition, they were expected to involve their staff in the determination of the indicators, the design of data collection and entry sheets, and the linking to the QVR spreadsheets on Microsoft SharePoint. The unit leaders were required to ensure the availability of resources for continuous data collection and entry. Appointed staff from each unit were taught how to collect and enter the data, which ensured continuity sufficient to withstand the effects of staff turnover.
Determination of Quality Improvement Projects: Each year the departments were expected to select a quality improvement (Q.I) project that would have a great impact on quality. The hospital departments, in conjunction with the M&E unit, were able to narrow down quality improvement (Q.I) projects from a review of the trends in KPI performance. Hospital-wide Q.I projects were easily selected from the combined set of institutional indicators (the institutional QVR). An example of this was the consulting clinics no-show project, which sought to reduce the number of booked patients who did not appear for their scheduled clinics for various reasons. The selected project was conducted in the following year in a bid to reduce the chances of no-show cases.
Time saving in quality meetings: The use of the QVR has enabled quicker discussion and shorter quality meetings regarding monitoring of indicators. This is because the agenda gets focused around the salient results on the QVR dashboards. This improves efficiency and effectiveness in decision making (Izurieta et al. 2011).
Ease of data access: As mentioned previously, before the rollout of the QVR system, data were stored on local drives, and accessing them was a laborious process of sending email requests and making phone call follow-ups. The QVR system made data availability easy through the Microsoft SharePoint platform; any authorized staff member could access the data from the comfort of their workstation.
Data processing: Through the use of built-in formulae, data were processed in real time, with embedded visualizations enhancing the efficiency and usability of the results. This reduced the time lag of data processing as well as the workload on the part of the M&E unit.
Elimination of redundancy: The previous system was rife with instances of multiple copies of reports residing with different users, as review was not centralized and reports were forwarded from one user to their seniors for review, and at times were not returned. Often the quality reports would vary due to editing. The QVR system now assures a single, most up-to-date copy of the data sheets and reports that is widely accessible to all users.
Automatic backup: The Microsoft SharePoint server enables automatic backup of the input data as long as the network connection remains uninterrupted.
The hospital’s Information Communication Technology (ICT) department developed a framework for maintaining daily backups on a hard disk separate from the server hosting the SharePoint data. In addition, offsite archiving of the data happened monthly, which enabled recovery from any loss, for instance if a user deleted an item erroneously. Every user underwent basic training on how to access and update items on the Microsoft SharePoint platform. A request for access rights to the Microsoft SharePoint server by any new potential user was sent to the ICT department for the access rights to be provided. This ensured coordinated control of who had access at any time. Microsoft SharePoint also keeps access logs, so the last person who accessed a certain document on the server can be determined. Once a new user was allowed access to SharePoint, passcodes for the files they needed to access were provided, ensuring that only the relevant files were accessed by the appropriate SharePoint users. Access to SharePoint data was further controlled at the document level, where each department set passwords for its own documents. The quality department’s M&E unit also instituted document protection passwords that prevented the structure of the QVR and the formulae contained in the worksheets from being altered. Data are not entered directly into the QVR but are fed in through primary data entry sheets with enhanced data validation restrictions, ensuring that only the required type of data is entered. As such, the QVR dashboards are accessible to user departments with passcodes that grant them read-only rights.
This section provides a review of the accomplishments observed since the adoption of the QVR as a hospital quality monitoring and evaluation system. The impact of the project was assessed against the goals of improving efficiency, effectiveness, reproducibility, and sustainability.
Continuity of the Monitoring and Evaluation System: Adoption of the QVR system increased KPI reporting since initial implementation; in the initial year of implementation (2017), a total of 50 departmental units had functional QVRs, increasing to 60 departmental units with functional QVR dashboards in 2018 and 2019. The participatory approach adopted in M&E enabled the communication of results, which enhanced the sharing of information within and across the departmental units. Guidance was provided on what information was to be shared during the multidisciplinary meetings, leading to increased motivation to participate due to clarity of expectations (Guijt 1999). The QVR system, developed through Microsoft Excel formulae linked across related units to the departmental level and across several departments to form the institutional QVR dashboard, provides an essential blueprint for a customized database system. The flexibility of Microsoft Excel formulae, which can be handled easily, provides a framework that can be adopted by others to improve automation through customized systems.
Ease of Data Visualisation and Digitization: One of the key indicators tracked through automation of a monitoring framework focused on Timeliness, Legibility and Completeness (TLC) of clinical documentation by various cadres of staff, namely doctors, nurses, and other allied healthcare workers. The documentation audit tool was digitized and the TLC results fed into the QVR system. This eased data collection and visualization through interoperability of the digitized TLC documentation audit tool and the QVR system, making the sharing and discussion of audit results among all stakeholders for quality improvement efficient. Further digitization of various other indicators was planned. Testimonials from various staff, including departmental leaders and unit staff, highlighted their satisfaction with the transformation brought about by the QVR system in terms of quality monitoring and evidence-based decision making. Additionally, many health facility delegations from within Kenya and other countries, who visited the hospital for benchmarking, expressed their excitement about the QVR innovation. The success of this monitoring platform has also been shared in various quality improvement seminars and symposiums. We embraced the concept of small wins (Abimbola 2021) from the onset of developing the QVR; the small wins provided the impetus to continue enhancing the system to what it is today.
Reduced turnaround time from data collection to analysis: Utilization of the action-planning feature increased from under 10% in 2017 to 78.4% in 2018. A similar challenge had been experienced in Iran when a framework for M&E was being set up: many indicators were noted to have gone without updates for long periods, leading to many gaps in health information monitoring (Abdi, Majdzadeh, and Ahmadnezhad 2019). A well-defined frequency of data gathering promotes the usefulness of the information gathered (Guijt 1999).
Standardized format of monitoring throughout the institution: Prior to the implementation of the QVR system, reports were prepared on standardized PowerPoint slides by the respective departments with the assistance of the statistician. This was a time-consuming exercise, since the data had to be provided via email and then analyzed. The QVR system provided a ready, standardized format of outputs that could not be altered and that enhanced aggregation, tracking and comparisons, working throughout the year after set-up. The creation of a standardized monitoring framework promotes comparability and usability (Abdi, Majdzadeh, and Ahmadnezhad 2019). It also became possible to obtain hospital-wide trends on cross-cutting indicators: the standardized reporting format made possible by the QVR system allowed aggregation of data on indicators such as compliance with hand hygiene practice, patient identification monitoring and client feedback gathering, which were monitored across many hospital departments.
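As a rough illustration of the roll-up described above, a cross-cutting indicator reported per unit can be aggregated to department and institution level. The unit names, departments and percentages below are invented for illustration; in the QVR itself this aggregation is done with linked Excel formulae across the networked spreadsheets.

```python
# Sketch of aggregating a cross-cutting indicator (e.g. hand-hygiene
# compliance %) from unit level up to department and institution level,
# mirroring the linked-spreadsheet roll-up. All figures are illustrative.
from statistics import mean

# unit -> (department, compliance %); invented example data
unit_scores = {
    "Ward A": ("Medicine", 88.0),
    "Ward B": ("Medicine", 92.0),
    "Theatre 1": ("Surgery", 75.0),
    "Theatre 2": ("Surgery", 81.0),
}

# Department-level roll-up: mean of the member units' scores.
by_department: dict[str, list[float]] = {}
for dept, score in unit_scores.values():
    by_department.setdefault(dept, []).append(score)
dept_scores = {dept: mean(scores) for dept, scores in by_department.items()}

# Institution-level roll-up: mean across the departments.
institution_score = mean(dept_scores.values())
```

A simple unweighted mean is assumed here; a real roll-up might weight units by sample size, which the article does not specify.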
Saving man hours: Prior to the deployment of the QVR system, considerable time was spent transcribing data from paper forms into the M&E database. Time was also lost in the collection and transportation of primary (paper) data collection sheets from the source departments to the M&E unit. The QVR system provides a platform of networked dashboards that allows continuous monitoring of performance at any time, as data are continually populated. Data validation commands customized in the dashboards prevent erroneous data entry. Before implementation of the QVR system, a data collector would spend over three days entering monthly data, which increased the chances of erroneous entries; with the QVR system, data entry happens every day and takes about 10% of an hour (5 to 6 minutes), which also facilitates easier monitoring and decision making through trend analysis at any time of the month.
Ease of communication to multiple stakeholders: Data on matters of quality are communicated to various stakeholders who require different levels of granularity in order to make decisions. As described above, this was previously done using Microsoft Excel with results exported into Microsoft PowerPoint slides, which lengthened the process of critical analysis and action planning. Its effectiveness was also limited by dependence on extra labour and on the technical know-how needed to handle the multiple software packages involved. Currently, through the QVR’s colour-coded “traffic lights” system (Red, Amber and Green), trending and detection of low-performing indicators has been simplified, allowing decision makers to quickly focus on undesirable trends for corrective action.
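The article does not state the exact thresholds behind the traffic-light colours, so the cut-offs below (relative to each indicator’s target) are assumptions chosen purely to illustrate the idea.

```python
# Sketch of a "traffic lights" classification for an indicator, as used
# in the QVR dashboards. The thresholds (>= 100% of target is Green,
# >= 80% is Amber, below that Red) are illustrative assumptions.

def traffic_light(achievement: float, target: float) -> str:
    """Classify performance against target as Green, Amber or Red."""
    ratio = achievement / target
    if ratio >= 1.0:
        return "Green"
    if ratio >= 0.8:
        return "Amber"
    return "Red"

# Example: monthly hand-hygiene compliance against a 90% target.
monthly = {"Jan": 95.0, "Feb": 76.0, "Mar": 60.0}
status = {month: traffic_light(score, 90.0) for month, score in monthly.items()}
```

In the spreadsheet itself the same effect is achieved with Excel conditional formatting rules keyed to the benchmark cell, so the colours update automatically as data are entered.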
Monitoring and evaluation are fundamental to the quality performance of a hospital through an elaborate system of establishing, measuring and assessing key performance indicators over time. A quality monitoring system such as the one we developed, which facilitates participatory development of indicators from the smallest unit or department to the highest hierarchy in the institution, ensures comprehensive coverage of quality monitoring. A spreadsheet program such as MS Excel, despite its simplicity, allows for flexible linking and computation of the chosen indicators. The Microsoft SharePoint server platform enables users to access and update indicators with ease via the intranet, made possible through linked spreadsheets. Pointers such as colour codes, sparklines, and benchmark-achievement variance computation allow for identification of performance gaps. Timely identification of these gaps enables prompt reviews, supported by a well-designed, standardized root cause analysis and action plan included in the system. Use of this system greatly enhanced quality performance monitoring in the hospital and identification of major bottlenecks that warranted hospital-wide or departmental-level projects. The QVR system enhanced the efficiency and accuracy of quality monitoring from data collection through to performance reviews and evaluation. This was demonstrated by the transformation the innovation brought to the hospital, revitalizing quality monitoring and providing a platform for better evidence-based decision making. Development and setting up of the QVR system required seamless internet connectivity, a deep understanding of the Microsoft SharePoint server and spreadsheet programs by the technical team, and trained manpower for data entry and utilization of results.
Due to the flexibility of the spreadsheet programme and the ease with which dashboards can be manipulated, coupled with the availability and low cost of acquiring the software, institutions mounting a framework for monitoring KPIs may adopt their use. The framework also provides an opportunity to understand how well various indicators can be measured and monitored. The system serves as a blueprint for a customized monitoring and evaluation database system.
1. Knowledge gap in technology: it took some time for some users to become acquainted with Microsoft SharePoint access and utilization. This was overcome through continued training and by always having an M&E and ICT staff member reachable to sort out any emerging issues.
2. Limited knowledge of Microsoft Excel usage: this was overcome through rigorous training conducted jointly by the quality department’s M&E unit and the ICT department’s training unit.
3. Staff turnover affected the seamlessness with which data were captured. Some newly recruited staff encountered challenges before becoming acquainted with the system.
4. Server downtime: during server downtimes, access to Microsoft SharePoint and the documents saved therein was a major challenge until server functioning was restored.
5. Limited document access across browsers: Microsoft Excel files with data validation features, or those which were password protected, were not easily accessed in most browsers, with the exception of Internet Explorer. This was resolved by setting users’ default browser to Internet Explorer and creating desktop shortcuts to the SharePoint folders.
6. Only a single user could edit a document on Microsoft SharePoint at any given time; other users could only access the document in “read only” mode. For this reason we created multiple data entry sheets to allow many users to continue data entry seamlessly, and encouraged users to close the data entry screens as soon as they had finished accessing or updating them.
All data underlying the results are available as part of the article and no additional source data are required.
First and foremost we would like to thank the almighty God. Special thanks to each and every author who took part in the compilation of this paper. We also thank the quality department, the ICT department, and the entire fraternity of the Aga Khan University Hospital, Nairobi, for their continued support since the inception of the QVR system.