Keywords
QVR, quality variance reports, monitoring and evaluation, hospital quality, key performance indicator, Microsoft SharePoint, Microsoft Excel
Monitoring and evaluation encompasses the systematic establishment of indicators, data collection and analysis, as well as periodic reviews to identify suitable action plans for improved performance of ongoing processes (Fajardo Dolci, Aguirre Gas, and Robledo Galván 2011). Indicators are specific metrics that enable evaluation of the achievement of goals and objectives; such metrics are commonly referred to as key performance indicators (KPIs). It is imperative that KPIs are SMART (specific, measurable, achievable, relevant and time-bound) and linked to inputs, activities, outputs, outcomes and goals (Kusek and Rist 2004). A robust monitoring and evaluation system ought to incorporate mixed research approaches (Bamberger, Rao, and Woolcock 2010). The involvement of all relevant stakeholders is necessary in determining the indicators to be tracked (Emberti Gialloreti et al. 2020, Malone et al. 2014, Diallo et al. 2003, Guijt 1999). Participatory monitoring and evaluation incorporates all stakeholders in every stage and process of the M&E system, and this engagement helps in determining the way forward and what modifications are needed for better M&E (Izurieta et al. 2011, Estrella et al. 2000, Guijt 1999, Estrella and Gaventa 1998).
The most challenging issue in monitoring and evaluation of indicators in healthcare is developing a systematic way of measuring and utilizing indicators for improvement; developing such a system, incorporating variance reporting, requires concerted effort and the inclusion of each team member (Brown and Nemeth 1998). There are myriad challenges to the design, development, and implementation of an effective M&E system, such as stakeholder buy-in, knowledge gaps, time constraints, logical frameworks, available resources, data validity, sampling techniques, and indicator measurements (Emberti Gialloreti et al. 2020, Anderson et al. 2019, Mthethwa and Jili 2016, Cameron 1993). Emberti Gialloreti et al. (2020) concluded that multiple stakeholder involvement is necessary for a successful and timely system for collecting and monitoring health information. Participatory M&E systems promote efficient, relevant and effective processes leading to elaborate corrective action (Guijt 1999). Recognizing these fundamental challenges, we designed and developed a monitoring and evaluation system incorporating hospital-wide key performance indicators. We named it the quality variance report (QVR) system; it is built from linked Microsoft Excel 2010 (Microsoft Excel, RRID:SCR_016137) spreadsheets hosted on an on-premises (intranet) Microsoft SharePoint 2010 server. As a Microsoft product, SharePoint gains new features with every release; among the functions available are web content management, social media functionality, and search features (Perran et al. 2013). Utilization of SharePoint fosters centralized online access to collaborative information (Ennis and Tims 2012). Knowledge sharing via various media is possible through SharePoint (Weldon 2012). A well-constituted SharePoint implementation strategy enables an institution to enhance collaboration, communication, and storage (Diffin et al. 2010), and in addition enables flexible spreadsheet data linkage (Prescott et al. 2010).
This was an improvement on the previous KPI monitoring process, which entailed submission of hard copies of filled data collection tools to the monitoring and evaluation unit. That was a time-consuming process beset by challenges, including lost forms, late submission, illegible handwriting, and transcription errors. In a study conducted by Jordans et al. (2019), illegible information was considered incorrect, which was similar to our experience. Automation of data collection tools was one way our project intended to reduce the challenge of illegible handwriting.
The previous system was heavily dependent on individual quality focal points in the departments, who maintained data on standalone computers and shared it by email. This presented another challenge during staff turnover, and in some cases data and related literature would be lost when machines crashed or data were accidentally deleted. There was also a lack of standardization in how data were maintained across the various quality focal points. The Microsoft SharePoint server hence came in handy, although its utilization created a need for elaborate training to ensure maximum uptake (Dahl 2010). Initially, utilization of SharePoint was limited to document management, where staff working on policies updated drafts in a dedicated folder on Microsoft SharePoint, enhancing collaboration; this was later extended to indicator monitoring.
This article presents a description of the system that we designed and implemented at the quality department of the Aga Khan University Hospital, Nairobi and rolled out across the entire hospital. It started as a basic standard system design which was improved with various features over time to address the emerging challenges experienced in monitoring and evaluation of quality indicators.
The QVR monitoring and evaluation (M&E) system is a mechanism that standardizes the appearance and availability of KPI dashboards, linked through spreadsheets on a Microsoft SharePoint server. The QVR takes its name from the ability of the quality monitoring dashboards it generates to show the difference between achievement and targets/benchmarks (the variance). Quality indicators are gathered from the smallest unit level up to the highest institutional level. The highest hierarchy of classification of the hospital's quality leadership is the institutional leadership, divided into clinical and non-clinical departments that branch further into various sub-sections and units. KPI dashboards are aligned in the same order, from the smallest unit to the highest in the echelon. A Microsoft SharePoint server anchored in the institution's intranet configuration enables data capture and access through spreadsheets to be done centrally by everyone with access rights. Access rights are obtained from the institution's Information Communication and Technology (ICT) administrator. The spreadsheets are linked so that similar indicators are grouped together at higher levels of the departmental structure, up to the institutional level, which enhances comparisons. The linking of data from data-entry screens allows seamless real-time tracking of indicators. The spreadsheets are saved in folders arranged in line with departmental organograms, with sub-folders equivalent to all sub-units; the data entry sheets (templates) are placed in each of the unit folders. Figure 1 demonstrates how QVRs are linked from the unit level to the institutional level. Several units comprise a section/program, and several of these form a department. X in the illustration indicates the possibility of multiple departments/units.
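The unit-to-department roll-up described above can be sketched in Python. This is an illustrative model only, not the hospital's actual spreadsheet configuration: the indicator name, unit records, and numerator/denominator structure are hypothetical stand-ins for the linked Excel formulae.

```python
# Hypothetical sketch of the unit -> department -> institution roll-up
# performed by the linked QVR spreadsheets. Names and data are illustrative.

def aggregate(indicator, units):
    """Combine numerators/denominators for one indicator across units,
    then recompute the rate at the higher level."""
    num = sum(u[indicator]["numerator"] for u in units if indicator in u)
    den = sum(u[indicator]["denominator"] for u in units if indicator in u)
    return {"numerator": num, "denominator": den,
            "rate": num / den if den else None}

# Two illustrative unit-level records for a shared indicator.
unit_a = {"hand_hygiene": {"numerator": 45, "denominator": 50}}
unit_b = {"hand_hygiene": {"numerator": 38, "denominator": 40}}

department = aggregate("hand_hygiene", [unit_a, unit_b])
print(department)  # numerator 83, denominator 90, rate ≈ 0.922
```

Aggregating counts rather than averaging unit-level rates is the design choice that makes the departmental and institutional figures exact, mirroring the numerator/denominator linkage described in the text.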
The indicators are determined at the departmental/unit level in line with institutional goals, departmental functions, quality expectations, inputs/outputs, clinical priorities, compliance to policies/procedures/protocols/guidelines/pathways, as well as in response to gaps in service delivery identified during root cause analyses. The committees that prepare the quality plans and indicator measures are referred to as departmental quality improvement and patient safety (DQIPS) committees. It is essential to incorporate all stakeholders in the formulation of indicator measures (Diallo et al. 2003). Each department has a quality advisor attached to it from the quality department, who acts as the liaison between the department's needs and quality expectations. The indicators are then presented by the institutional quality improvement and patient safety (QIPS) committee to the joint staff committee (JSC) for review and approval. Figure 2 portrays the data aggregation process flow we developed to guide the practice. The process begins with identification of what needs to be measured; a literature review is undertaken to establish appropriate research questions and benchmarks, if available (where benchmarks are not available, internal targets are set after the pilot phase). A research design is then determined, through which data collection tools are designed and developed; sample determination follows; data collection methods and an analysis plan are established; and tool validation is done during the pilot phase. When the data tools and data collection teams are deemed appropriate, collection and processing of data begins, and trends are monitored over time. The frequency of data collection depends on the indicator being monitored; the data entry sheets are designed such that data are collected continuously.
The data are networked/mapped in an orderly manner, with similar indicators aggregated together at a higher level, which leads to better monitoring (Abdi, Majdzadeh, and Ahmadnezhad 2019). For information to be useful, data ought to be gathered at a well-defined frequency (Prennushi, Rubio, and Subbarao 2002, Guijt 1999).
The sample design is determined in accordance with the characteristics of the population. Specifically, sample size is determined by considering factors such as the design, sampling method, the indicator/outcome being measured, design effect and variance, power, level of significance, as well as the level of precision desired for the indicator estimates (Chander 2017). Sample units are drawn using either simple random sampling without replacement or systematic random sampling. Microsoft Excel provides users with random number functions (=RAND(), =RANDBETWEEN(x,y)) and an add-in, the Analysis ToolPak; either of these is used to determine the sample units. A sampling guideline was developed, with the assistance of a statistician, to help users with implementation.
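The two sampling schemes can be sketched as follows. This is a minimal illustration of the general techniques named in the text, not the hospital's actual sampling guideline; the record identifiers are invented, and the comments note the rough Excel equivalents.

```python
import random

def simple_random_sample(units, n, seed=None):
    """Simple random sampling without replacement
    (Excel equivalent: sort rows by =RAND(), take the first n)."""
    rng = random.Random(seed)
    return rng.sample(units, n)

def systematic_sample(units, n, seed=None):
    """Systematic random sampling: a random start, then every k-th unit
    (Excel equivalent: start at =RANDBETWEEN(1, k), step by k)."""
    k = len(units) // n          # sampling interval
    rng = random.Random(seed)
    start = rng.randrange(k)     # random start within the first interval
    return [units[start + i * k] for i in range(n)]

# Hypothetical frame of 200 medical record numbers.
records = [f"MRN{i:04d}" for i in range(1, 201)]
print(simple_random_sample(records, 10, seed=1))
print(systematic_sample(records, 10, seed=1))
```

Systematic sampling is simpler to execute on a sorted audit list, while simple random sampling avoids any periodicity bias in the frame; either suits the audit samples described here.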
Data collection tools were designed specifically to suit the measurement criteria of each indicator, in a format that ensured easy entry and linkage with other indicators for results aggregation. The data entry sheets were created in user-friendly formats to provide a platform for easy data entry; for instance, drop-down lists were used where applicable, and restrictions were applied to ensure the specified data were entered correctly in the cells. Customized error messages appear if an incorrect data type or format is entered. The Excel functions in the data entry screens were locked for editing (using passwords accessible only to the M&E unit) to prevent alteration by users during data entry. Figure 3 portrays an example of a data entry sheet used to collect data on compliance with the requirement for correct patient identification while administering treatment or giving any other service to a patient. Columns B, G, H and J have drop-down lists for easier data entry, and all columns have restrictions such that only data in the specified format can be entered. Column E contains the medical record numbers of the patients sampled, column F provides a description of the procedure or service audited, and column J the category of staff observed; the names of the staff observed and of the auditor are also captured. The data from the sheet from which this snapshot is obtained are aggregated into a dashboard by a formula separating the numerator (the opportunities complied with) from the total number of opportunities observed, and a proportion of compliance is computed.
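The validation-plus-aggregation logic of such a data entry sheet can be sketched in a few lines. The allowed categories and audit rows below are hypothetical examples, not the hospital's actual drop-down lists; the sketch only illustrates how list restrictions and the numerator/denominator formula interact.

```python
# Illustrative mimic of the sheet's drop-down restrictions and the
# dashboard's compliance formula. Category values are hypothetical.

ALLOWED_OUTCOMES = {"complied", "not complied"}
ALLOWED_STAFF = {"nurse", "doctor", "pharmacist", "other"}

def validate_row(row):
    """Reject rows outside the allowed lists, like Excel data validation."""
    if row["outcome"] not in ALLOWED_OUTCOMES:
        raise ValueError(f"invalid outcome: {row['outcome']}")
    if row["staff"] not in ALLOWED_STAFF:
        raise ValueError(f"invalid staff category: {row['staff']}")
    return row

def compliance_rate(rows):
    """Numerator (opportunities complied with) over total opportunities."""
    complied = sum(1 for r in rows if r["outcome"] == "complied")
    return complied / len(rows) if rows else None

audits = [validate_row(r) for r in [
    {"outcome": "complied", "staff": "nurse"},
    {"outcome": "complied", "staff": "doctor"},
    {"outcome": "not complied", "staff": "nurse"},
    {"outcome": "complied", "staff": "pharmacist"},
]]
print(compliance_rate(audits))  # 0.75
```

Rejecting a row at entry time, rather than cleaning it later, is what keeps the linked dashboards trustworthy without any downstream transcription step.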
The QVR is a workbook comprising several sheets, as shown in Figure 4. The tab named QVR contains the master KPI dashboard described under the QVR master sheet section. The Definitions tab contains a sheet that explains the indicators and how they are measured. The dashboard tab presents data in terms of absolute numbers as well as ratios, proportions and rates, with the data linked from the data entry sheets. An action plan tab provides users with a platform for documenting gaps, root causes and the resulting action plans.
The QVR master sheet comprises various components, as demonstrated in Figure 5. The first column contains an indication of the current year of tracking and the indicators measured; the second column defines how the indicators are measured. The third column contains either a target or a benchmark corresponding to each indicator; a benchmark is derived from literature sources or from other hospitals' accomplishments (in the absence of these, internal targets are determined). The previous year's average performance is presented to facilitate comparisons with the current year's performance. Monthly performance is directly linked to the data entry sheets and populates automatically as the data are entered. Quarterly performance has functions that compute performance levels from the monthly data. The trends column uses sparklines, which portray monthly trends from January to December. Variance indicates the percent difference between the benchmark/target and the quarterly performance. The annual column indicates performance from the beginning of the year up to the current date. Literature references provide the sources from which benchmarks are derived. Hazard vulnerability analysis provides rating metrics for the risk score based on the likelihood of risk occurrence as well as the consequences/severity of occurrence, as per the guide obtained from the institution's risk program.
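The variance column can be illustrated as below. The exact Excel formula is not given in the text, so this assumes the common definition of percent difference relative to the target; if the hospital's sheet uses a simple absolute difference instead, the sketch would change accordingly.

```python
def variance(performance, target):
    """Percent difference between achievement and target/benchmark,
    assuming the common (performance - target) / target definition.
    Positive values mean performance exceeded the target."""
    return round((performance - target) / target * 100, 1)

print(variance(0.92, 0.90))  # 2.2   -> above benchmark
print(variance(0.80, 0.90))  # -11.1 -> below benchmark
```

A signed variance lets reviewers scan a single column for both the direction and the size of each gap, which is what the action plan reviews described later rely on.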
KPI monitoring is enhanced visually by allocating colour codes to performances across the months and quarters, using Microsoft Excel's conditional formatting feature to promote visual assessment of indicators. In the configuration, green portrays an indicator that has met or exceeded the set benchmark or target. Red appears if the indicator measure is below the benchmark/target. Amber appears if the performance is below the benchmark but higher than the average performance of the previous year. White is present if, for that particular period, the indicator is not applicable, and finally grey represents 'grey areas', where data are due but not yet populated. All these colour codes change automatically as data are input into the linked data entry sheets. The final indicator measure on each QVR master sheet is the timeliness of data provision, expressed as the proportion of all indicators measured by the unit/department that have data populated by the 5th of the subsequent month. The colour code configuration is illustrated in Figure 6.
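The conditional-formatting rules can be written as a small decision function. This is a sketch of the rules as described, under the assumption that higher values are better; for lower-is-better indicators (e.g. infection rates) the comparisons would be reversed, and the not-applicable (white) case is omitted for brevity.

```python
def status_colour(performance, target, previous_year_avg):
    """Map a monthly/quarterly value to the QVR 'traffic light' colours,
    assuming higher-is-better indicators. None marks data not yet entered."""
    if performance is None:
        return "grey"    # data due but not yet populated
    if performance >= target:
        return "green"   # met or exceeded benchmark/target
    if performance > previous_year_avg:
        return "amber"   # below benchmark but better than last year
    return "red"         # below benchmark and not improving

print(status_colour(0.95, 0.90, 0.85))  # green
print(status_colour(0.88, 0.90, 0.85))  # amber
print(status_colour(0.80, 0.90, 0.85))  # red
print(status_colour(None, 0.90, 0.85))  # grey
```

Encoding amber as "below target but above last year's average" is what distinguishes an indicator that is failing from one that is improving but not yet there.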
The action plan tab consists of a platform that aids in documenting corrective actions arising from performance reviews/evaluation. Reviews are made at least once a month and gaps are listed in the action plan sheet. The gaps are identified by checking the colour codes, the trends column, and the variance column (see Figures 5 and 6). If an indicator presents more reds than other indicators, this suggests a gap that needs to be dealt with; similarly, if the trends column shows a poorly performing trend or a sudden change either upwards or downwards, this presents an opportunity for root cause analysis. Finally, in the variance column, a greater percentage variance presents an opportunity for further action. Figure 7 portrays the components of the action plan tab: a serial number column, the month, the indicator with undesired performance, the gaps and their causes as identified through root cause analysis, the action plan containing the corrective action, a timeline for closure, the individual responsible for follow-up, the review date, and additional comments. The process of evaluation hence takes in both a quantitative component and a qualitative one.
To ensure effective utilization of the various components and features, training was provided to each new user. Previously trained users were expected to pass on the knowledge, skills, and competencies acquired to their colleagues within departmental units. Advanced training needs identified were addressed by the Quality department through the M&E unit, in collaboration with the ICT training unit. Gaining the right competencies enhances the success of the M&E; successful training refers to the proper acquisition of competencies, skills, and knowledge, which are imperative in achieving a working M&E system (Fletcher et al. 2014).
The hospital's Information Communication Technology (ICT) department developed a framework of maintaining daily back-ups on a hard disk separate from the server hosting the SharePoint data. In addition, offsite archiving of the data happened monthly, which enabled recovery of any loss in case a user deleted an item erroneously. Every user underwent basic training on how to access and update items on the Microsoft SharePoint platform. Any new potential user sent a request for access rights to the Microsoft SharePoint server to ICT, ensuring coordinated control of who had access at any time. Microsoft SharePoint also keeps access logs, so the last person who accessed a given document on the server can be determined. Once a new user was granted access to SharePoint, passcodes for the files they needed to access were provided, ensuring that only the relevant files were accessed by the appropriate SharePoint users. Access to SharePoint data was further controlled at the document level, with each department setting passwords for its own documents. The quality department's M&E unit also instituted document protection passwords that prevented the structure of the QVR and the formulae contained in the worksheets from being altered. Data are not entered directly into the QVR but are fed in through primary data entry sheets with data validation restrictions ensuring only the required type of data is entered. As such, the QVR dashboards are accessible to user departments with passcodes that grant them read-only rights.
This section provides a review of the accomplishments observed since adoption of the QVR as a hospital quality monitoring and evaluation system.
Continuity of the monitoring and evaluation system: Before implementation of the QVR reporting system, there was a major challenge in regard to data collection and sharing with the M&E unit for processing. When a data provider was away, data would not be available until that provider resumed work and shared it. The situation was worse in cases of service interruption through staff dismissal or resignation, since in some instances a replacement would not be available immediately, or the incoming staff needed an extended period to adapt. After implementation of the QVR system, each unit head was required to be actively involved in the entire process. In addition, they were expected to involve their staff from the determination of the indicators, through the design of data collection and entry sheets, to linking to the QVR spreadsheets on Microsoft SharePoint. The unit leaders were required to ensure the availability of resources for continuous data collection and entry. Appointed staff from each unit were taught how to collect and enter the data, which ensured enough continuity to withstand the effects of staff turnover. Adoption of the QVR system has increased tremendously since initial implementation in 2017, when a total of 50 departmental units had functional QVRs; this rose to 60 departmental units with functional QVR dashboards in 2018 and 2019. The participatory approach adopted in M&E enabled communication of results, which enhanced sharing of information within and across the departmental units. Guidance was provided on what information was meant to be shared during the multidisciplinary meetings, leading to increased motivation to participate due to clarity on the expectations (Guijt 1999).
The QVR system, developed through Microsoft Excel formulae with related units linked to the departmental level and several departments linked to form the institutional QVR dashboard, provides an essential blueprint for a customized database system. Because Microsoft Excel formulae are flexible and easy to handle, the framework can be adopted by others to improve automation through customized systems.
Reduced turnaround time from data collection to analysis: the QVRs were monitored at the beginning of each month (by the 5th) to ensure that all the requisite data were collected and entered in good time, hence impacting the utilization of the results. A score was allocated every month based on the availability of data as a proportion of all the KPIs in each unit/department's QVR. Similarly, utilization of various features, such as the determination and review of an action plan, was further considered when establishing the best performing departmental unit of the year. Utilization of the action planning feature increased from <10% in 2017 to 78.4% in 2018. A similar challenge had been experienced in Iran when a framework for M&E was being set up: many indicators were noted to have gone un-updated for long periods, leading to many gaps in health information monitoring (Abdi, Majdzadeh, and Ahmadnezhad 2019). A well-defined frequency of data gathering promotes the usefulness of the information gathered (Guijt 1999).
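The monthly timeliness score can be illustrated as follows. The indicator names and dates are invented for the example; the only rule taken from the text is the deadline of the 5th of the month following the reporting period.

```python
import datetime

def timeliness_score(entry_dates, period_end, grace_day=5):
    """Proportion of KPIs whose data were populated by the 5th of the
    month after the reporting period. `entry_dates` maps each KPI to
    the date its data were entered (None if still missing)."""
    # Deadline: the grace day of the month following `period_end`.
    next_month = period_end.replace(day=28) + datetime.timedelta(days=7)
    deadline = next_month.replace(day=grace_day)
    on_time = sum(1 for d in entry_dates.values()
                  if d is not None and d <= deadline)
    return on_time / len(entry_dates)

# Hypothetical January reporting period with four KPIs.
entries = {
    "hand_hygiene": datetime.date(2018, 2, 3),
    "patient_identification": datetime.date(2018, 2, 5),
    "medication_errors": datetime.date(2018, 2, 9),  # late
    "client_feedback": None,                         # missing
}
print(timeliness_score(entries, datetime.date(2018, 1, 31)))  # 0.5
```

Scoring timeliness as its own KPI, as the QVR does, turns data discipline itself into a monitored and comparable indicator across departments.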
Standardized format of monitoring throughout the institution: prior to the implementation of the QVR system, reports were prepared on standardized PowerPoint slides by the respective departments with the assistance of the statistician. This was a time-consuming exercise, since the data had to be provided via email and then analysed. The QVR system provided a ready and standardized format of outputs that were not alterable and enhanced aggregation, tracking and comparisons, and that worked throughout the year once set up. Creation of a standardized monitoring framework promotes comparability and usability (Abdi, Majdzadeh, and Ahmadnezhad 2019). It is also possible to obtain hospital-wide trends on cross-cutting indicators: with the standardized format of reporting made possible by the QVR system, it became possible to aggregate data on indicators monitored across many hospital departments, such as compliance with hand hygiene practice, patient identification monitoring, and client feedback gathering.
Saving man-hours: prior to the deployment of the QVR system, considerable time was spent transcribing data from paper forms into the M&E database. Time was also lost in the collection and transportation of primary (paper) data collection sheets from the source departments to the M&E unit. The QVR system provides a platform of networked dashboards that allows continuous monitoring of performance at any time, as data are continually populated. Data validation commands customized in the dashboards prevent erroneous data entry. Before implementation of the QVR system, a data collector would spend over three days entering monthly data, which increased the chances of erroneous entries; the QVR system rectified this, with data entry now taking roughly 10% of an hour (5 to 6 minutes) each day. This also facilitates easier monitoring and decision making through trend analysis at any time of the month.
Ease of communication to multiple stakeholders: data on matters of quality are communicated to various stakeholders who require different levels of data granularity in order to make decisions. As described above, this was previously done by using Microsoft Excel and then exporting into Microsoft PowerPoint slides, which lengthened the process of critical analysis and action planning; its effectiveness was also limited by dependence on an extra labour force and the technical know-how needed to handle the multiple software packages involved. Currently, through the use of the QVR colour-coded "traffic light" system (red, green and amber), trending and detection of low-performing indicators has been simplified, allowing decision makers to quickly focus on undesirable trends for corrective action.
Time saving in quality meetings: The use of the QVR has enabled quicker discussion and shorter quality meetings regarding monitoring of indicators. This is because the agenda gets focused around the salient results on the QVR dashboards. This improves efficiency and effectiveness in decision making (Izurieta et al. 2011).
Determination of quality improvement projects: each year the departments were expected to select a quality improvement (Q.I) project that would have a great impact on quality. The hospital departments, in conjunction with the M&E unit, were able to narrow down quality improvement projects from a review of the trends in KPI performance. Hospital-wide Q.I projects were easily selected from the combined set of institutional indicators (the institutional QVR). An example was the consulting clinics no-show project, which sought to reduce the number of booked patients who, for various reasons, did not appear for their scheduled clinics. The selected project was conducted in the following year in a bid to reduce the chances of no-show cases.
Ease of data access: as mentioned previously, before rolling out the QVR system data were stored on local drives and accessing the same was a laborious process of sending email requests and phone call follow-ups. The QVR system made data availability easy through the Microsoft SharePoint platform; any authorised staff could access the data from the comfort of their workstations.
Data processing: through the use of built-in formulae, data are processed in real time enhancing efficiency and usability of the results. This reduces time lag of data processing as well as workload on the part of the M&E unit.
Elimination of redundancy: the previous system was rife with instances of multiple copies of reports residing with different users, as the review was not centralized; reports were forwarded from one user to their seniors for review and at times were not returned. Often the quality reports would have variations due to editing. The QVR system now assures a single, most up-to-date copy of data sheets and reports that is widely accessible to all users.
Automatic back up: The Microsoft SharePoint server enables automatic backup of the input data as long as the internet connection remains uninterrupted.
Monitoring and evaluation is fundamental to the quality performance of a hospital, through an elaborate system of establishing, measuring and assessing key performance indicators over time. A quality system that allows for participatory development of indicators, from the smallest unit/department to the highest hierarchy in the institution, ensures exhaustive coverage of the scope of quality monitoring. A spreadsheet program, despite its simplicity, allows for flexible linking and computation of the chosen indicators. The Microsoft SharePoint server platform enables users to access and update indicators with ease via the intranet, made possible through linked spreadsheets. Pointers such as colour codes, sparklines, and benchmark-achievement variance computation allow for the identification of performance gaps; timely identification of these gaps enables prompt reviews, supported by the well-designed, standardized root cause analysis and action plan included in the system. Use of this system greatly enhanced quality performance monitoring in the hospital and the identification of major bottlenecks that warranted hospital-wide or departmental-level projects. The QVR system enhanced the efficiency and accuracy of quality monitoring from data collection through to performance reviews/evaluation. Developing and setting up the QVR system required seamless internet connectivity, a deep understanding of the Microsoft SharePoint server and the spreadsheet program by the technical team, and trained manpower for data entry and utilization of results. Due to the flexibility of the spreadsheet program, the ease with which dashboards can be manipulated, and the availability and low cost of acquiring the software, institutions mounting a framework for monitoring KPIs may adopt their use. The framework also provides an opportunity to understand how well various indicators can be measured and monitored. The system provides a blueprint for a customized M&E database system.
1. Knowledge gap in technology: it took some time for some users to become acquainted with Microsoft SharePoint access and utilization. This was overcome through continued training and by always having an M&E or ICT staff member reachable to sort out any emerging issues.
2. Limited knowledge of Microsoft Excel usage: this was overcome through rigorous training by the quality department's M&E unit and the ICT department's training unit.
3. Staff turnover affected the seamlessness with which data were captured. Some newly recruited staff encountered challenges before they became acquainted with the system.
4. Server downtime: during server downtimes, access to Microsoft SharePoint and the documents saved therein was a big challenge until server functioning was restored.
5. Limited document access: Microsoft Excel files with data validation features, or those that were password protected, were not easily accessed using most browsers, with the exception of Internet Explorer. This was resolved by setting the default browser to Internet Explorer for easier access and creating desktop shortcuts to the SharePoint folders.
6. Only a single user could edit a document on Microsoft SharePoint at any given time, during which other users could only access the document in "read only" mode. For this reason we created multiple data entry sheets to allow many users to continue data entry seamlessly, and encouraged them to close the data entry screens as soon as they completed accessing or updating them.
All data underlying the results are available as part of the article and no additional source data are required.
First and foremost, we would like to thank almighty God. Special thanks to each and every author who took part in the compilation of this paper, and to the quality department, the ICT department, and the entire fraternity of the Aga Khan University Hospital, Nairobi for their continued support since the inception of the QVR system.
Version 2 (revision): 25 Jun 24
Version 1: 02 Sep 21