Research Article

Pay-for-performance in resource-constrained settings: Lessons learned from Thailand’s Quality and Outcomes Framework

[version 1; peer review: 1 approved, 1 approved with reservations]
PUBLISHED 18 Nov 2016

Abstract

Introduction. Many countries have introduced pay-for-performance (P4P) models to encourage health providers and institutions to deliver good quality of care. In 2013, the National Health Security Office of Thailand introduced P4P, based on the UK Quality and Outcomes Framework (QOF), as a mandatory programme for primary care providers. This study reviews the first year of the Thai QOF policy, focusing on the key features of its formulation and implementation stages.

Methods. This study used a mix of data collection approaches, including a literature review, in-depth interviews with QOF managers, and focus-group discussions with health officers and practitioners. Data were collected between June and August 2015, transcribed, and analysed using qualitative content analysis (interpretation of the content of text data through the systematic classification process of identifying themes or patterns). Two consultation meetings were organised to verify the preliminary findings.

Results. Based on the UK model, the Thai QOF was formulated without formal consultation with key stakeholders. Additionally, programme managers adopted a ‘learning by doing’ approach, since Thai authorities were already aware of health system limitations, such as insufficient human and financial resources and unreliable databases. There were also problems with QOF implementation, as follows: 1) deducting the QOF budget from the annual payment for ambulatory care undermined the policy’s legitimacy, because providers did not receive full subsidisation of their service delivery; 2) the lack of key stakeholder engagement resulted in miscommunication, and subsequently misunderstanding and inadequate coordination, in translating QOF policy into action; and 3) the unreliability of the IT system led to inaccurately reported data on service delivery, thereby adversely affecting measured performance.

Conclusion. There is still room for improvement in formulating and implementing the Thai QOF programme. Policy makers and programme implementers at both the national and international levels can draw on this study to ensure effective policy transfer and implementation of future QOF programmes.

Keywords

Quality and Outcomes Framework, policy formulation, program implementation, Pay for Performance, Primary care quality

Abbreviations

UHC - Universal health coverage

UCS - Universal coverage scheme

NHSO - National Health Security Office

MOPH - Ministry of Public Health

CUP - Contracting unit for primary care

PCU - Primary care unit

P4P - Pay-for-Performance

QOF - Quality and Outcomes Framework

KPI - Key performance indicator

Introduction

Thailand achieved universal health coverage (UHC) in 2002 through the implementation of a universal coverage scheme (UCS) for the majority of the Thai population (75%), in addition to the existing government-funded health insurance schemes: the Civil Servant Medical Benefit Scheme for public employees and their dependents, and Social Health Insurance for formal-sector private employees1. The Thai UHC focuses on promoting primary healthcare, with an emphasis on disease prevention and health promotion, which is also in line with the Sustainable Development Goals2. Additionally, the purchaser and provider functions were separated following the introduction of UHC: the National Health Security Office (NHSO) acts as the healthcare purchaser, while the Ministry of Public Health (MOPH) remains the main provider. With this split, the NHSO holds more than half of the total health budget, while the MOPH owns more than 80% of government health facilities.

The UCS requires its beneficiaries to register with a contracting unit for primary care (CUP) in their catchment area. In general, a CUP includes one district hospital and several health-promoting hospitals or primary care units (PCUs). District hospitals, which are staffed with physicians, nurses and other allied health professionals, offer both primary and secondary care services. On the other hand, health-promoting hospitals, which are staffed by nurse practitioners and public health officers, only provide primary care, community services, health promotion and disease prevention services1. The NHSO allocates the budget for ambulatory services on a prepaid capitation basis, i.e. a fixed rate per person registered to each CUP, equivalent to USD 90 per capita. According to the National Health Security Act, the NHSO is entitled to provide contracts to qualified CUPs3; however, the purchaser has limited choices, especially in rural areas where MOPH facilities are predominant.

In 2010, the NHSO introduced the first pay-for-performance (P4P) programme, called on-top payment, in order to reduce variations in quality and accessibility of care provision by encouraging CUPs to improve infrastructure and staffing4. Since this programme was criticised for not clearly contributing to the quality of services and health outcomes, it was replaced by the Quality and Outcomes Framework (QOF) in October 20135. As a P4P initiative, the QOF incentivises health providers to improve primary care quality in key predetermined areas, namely (i) health promotion and disease prevention; (ii) primary healthcare services; (iii) organisational development and management; and (iv) services targeted to local need (Figure 1). There are two types of QOF indicators: 1) core indicators, used at the CUP level throughout the country, and 2) local indicators, developed by regional health boards, consisting of NHSO and MOPH senior officers at Provincial Health Offices and regional, provincial, and district hospitals. The core indicators comprise nine quality measures in five key primary care services, including maternal and child health, cervical cancer screening, management of asthma, diabetes and hypertension, and the structure of primary care organisational development, e.g. the percentage of people who have access to a physician. Some of these indicators, such as the percentage of pregnant women who received antenatal care before 12 weeks of gestation, are also adopted by the MOPH as key performance indicators (KPIs) for monitoring and evaluating the service delivery in its health facilities. However, some of the QOF indicators, such as the percentage of diabetes patients admitted to the hospital due to short-term complications from diabetes, are not included in the MOPH’s KPI list. In addition, this initiative allows regional health boards to develop regional indicators with the aim of decentralising decision-making power, motivating participation of local actors, and addressing local health problems and health delivery factors. As a result, the numbers and sets of indicators differ across the 13 regions of the country.


Figure 1. QOF indicators and point value in 2014 (total 1,000 points).

QOF = Quality and Outcomes Framework, OP = Out Patient, PCU = Primary Care Unit.

The Thai QOF was implemented in 1,293 CUPs country-wide. Achievement against each indicator is calculated on an annual basis using national patient care databases and given a point value. In principle, the total points achieved are then converted into a financial value, which is allocated to CUPs. In the first year, the NHSO disseminated QOF details, e.g. indicators and points, through its regional offices, regional health boards, and CUPs. The NHSO then requested each regional health board to develop regional QOF guidelines together with local indicators. Once providers delivered services, they recorded the service provision in the existing MOPH database. The information was then transferred to the provincial data centre, which manages the data and submits it monthly to the MOPH (Figure 2). Subsequently, the NHSO extracts selected fields from the database and analyses the data related to the core QOF indicators, reporting the details of the analysis back to the regional health boards. Local indicators, meanwhile, are collected and analysed at the regional level. To fund the programme, the NHSO allocates the QOF budget to the regional health boards according to the number of people registered to the health facilities in each region and requests the regional health boards to allocate the payment to the CUPs. The NHSO regards this as reasonable because the QOF budget is part of the budget for ambulatory services, which is itself allocated per registered person.


Figure 2. A flow diagram of service delivery data from PCUs to the MOPH databases and budget allocation from the NHSO to the CUPs.

PCUs = Primary Care Units, MOPH = Ministry of Public Health, NHSO = National Health Security Office, CUPs = Contracting Units for Primary care, QOF = Quality and Outcomes Framework.
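To make the allocation mechanics described above concrete, the sketch below is a minimal illustration under simplifying assumptions: the regional QOF budget is taken as proportional to the registered population (as stated above), and each CUP’s payment as proportional to the points it achieved, as in the regions that paid purely on scores (see Results). The article does not specify the exact conversion formula, and all names and figures here are hypothetical.

```python
# Illustrative sketch only: the NHSO's actual conversion of QOF points into payments
# is not fully specified in the article, and regional health boards applied varying
# criteria. Assumptions: regional budgets follow registered population; CUP payments
# follow achieved points (out of a 1,000-point total).

def regional_qof_budget(national_qof_budget: float,
                        region_population: int,
                        national_population: int) -> float:
    """Share of the national QOF budget allocated to one regional health board,
    in proportion to the population registered with facilities in the region."""
    return national_qof_budget * region_population / national_population


def allocate_to_cups(regional_budget: float,
                     cup_points: dict[str, float]) -> dict[str, float]:
    """Convert each CUP's achieved points into a proportional share of the
    regional QOF budget."""
    total_points = sum(cup_points.values())
    if total_points == 0:
        return {cup: 0.0 for cup in cup_points}
    return {cup: regional_budget * points / total_points
            for cup, points in cup_points.items()}


if __name__ == "__main__":
    # Hypothetical figures for one region with three CUPs.
    budget = regional_qof_budget(national_qof_budget=1_000_000_000,
                                 region_population=3_600_000,
                                 national_population=48_000_000)
    print(allocate_to_cups(budget, {"CUP A": 820, "CUP B": 640, "CUP C": 710}))
```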

One year after the QOF implementation, key stakeholders at the national, regional, and peripheral levels raised concerns about the mismanagement of the scheme and highlighted the need for improvement in different aspects of the programme. This study was commissioned by the NHSO to review the first year of the Thai QOF programme, with a focus on key features of the policy formulation and implementation stages. These include the policy design, implementation gaps, impeding factors, and health providers’ perceptions of the policy. This paper describes the results of this evaluation as well as the lessons learned and implications for the QOF in Thailand.

Methods

A qualitative study was performed, including a review of literature, in-depth interviews, and focus-group discussions. Data were collected between June and August 2015.

Data collection

Document review. Key international publications and unpublished research reports related to the UK QOF were reviewed, together with relevant documents on the Thai QOF, including published literature, research reports, and health policy documents. International publications were searched in PubMed (https://www.ncbi.nlm.nih.gov/pubmed/) using key words including ‘QOF’ and ‘quality indicators’. Papers published between 1 January 2004 and July 2015 were considered. Unpublished research reports were identified by UK researchers (University of Birmingham). The NHSO and its regional offices provided relevant documents in Thai, including published literature, research reports, and health policy documents.

Key informant interviews. In June 2015, in-depth interviews were conducted with 11 key informants using semi-structured interview guides (Supplementary File 1). Purposive sampling was performed; informants were recruited on the basis that they were responsible for formulating QOF policy or managing the programme at the national or regional level. Recruitment was performed under the supervision of the QOF programme managers in the NHSO. The informants included five executives and two programme managers at the national level, and three executives of NHSO offices and one programme manager at the regional level (Table 1). All QOF managers at the national level were interviewed. At the regional level, we categorised the 13 NHSO regional offices into five groups. The first group, Bangkok, was selected because of its different context, such as population density and mobility, patients’ help-seeking behaviours and lifestyles, and its primary care system; accordingly, Bangkok implemented different QOF indicators, payment criteria and management strategies. Regional offices in the southern, northern, north-eastern and central parts of Thailand were randomly selected (one office per region) for interviews.

Table 1. Characteristics of 11 interviewed informants.

NHSO = National Health Security Office, QOF = Quality and Outcomes Framework.

Characteristic                                                     Number of interviews
Gender
    Male                                                           8
    Female                                                         3
Organisation
    NHSO                                                           5
    NHSO regional offices                                          6
Responsibility
    Managing QOF at national level                                 3
    Managing health promotion and disease prevention programme
    or secondary and tertiary care programme                       2
    Managing QOF at regional level                                 6

Focus group discussions. Two separate focus group discussion sessions were convened in July 2015 (one with 7 representatives and the other with 8). A researcher (ST) led the discussions, which were guided by sets of predefined questions covering the benefits of the QOF to health facilities; the relationship between QOF performance, scores and payment; QOF payment allocation; barriers; and suggestions to improve the QOF programme. Purposive sampling was performed, and the recruitment of participants was carried out in two steps. First, the two provinces in each region (12 regions, excluding the Bangkok Metropolitan region) with the highest and lowest QOF scores were selected. Second, different types of health facilities (e.g. district hospitals and health-promoting hospitals), as well as provincial and district health offices from these 24 provinces, were selected based on consultations with NHSO staff in the regions. In total, 24 representatives from these organisations, who had been involved in the QOF introduction either as supervisors or primary care providers, were invited to participate in the meetings. Fifteen informants agreed to participate (Table 2).

Table 2. Characteristics of 15 key informants who participated in the focus group discussions.

Characteristic                                Number of informants
Gender
    Male                                      3
    Female                                    12
Organisation
    Provincial health office                  3
    District health office                    1
    Health facility                           10
Geographical location
    Northern part                             4
    Middle part                               4
    North-eastern part                        4
    Southern part                             3
Position
    Quality assurance board member            4
    Data manager                              2
    Health facility director                  1
    Health practitioner                       8

Data analysis

Interviews and focus group discussions were audio-recorded, transcribed, and analysed using qualitative content analysis, a research method for interpreting the content of text data through a systematic classification process of identifying themes or patterns6. A single researcher read through the transcripts repeatedly to identify the key issues in the text data, which were then sorted into categories and themes according to how the issues were related and linked. The research team then discussed the emerging categories, formed themes, and revised categories where appropriate.

Quality assurance

Two consultation meetings were organised in July 2015. The aims of these meetings were for the research team to check that key information had been collected and to verify preliminary findings. The first meeting involved 27 key stakeholders at the national level, including policy makers from the MOPH, researchers from universities, as well as representatives from the NHSO and its regional offices. At the second meeting, 31 participants attended, including representatives from the NHSO, provincial and district health offices, health facilities, non-governmental organisations, and patient groups. Six informants in this study attended the consultation meetings.

Results

Four major themes emerged from the analysis of the data collected in this study: policy formulation, programme implementation, problems with QOF implementation, and provider perceptions of the QOF programme. These are described in detail below.

Policy formulation

Although P4P policies have been implemented in countries across all income levels (high, middle, and low), the Thai QOF was based on the UK model, since the NHSO regarded the UK QOF as the best known and the most widely implemented. In addition, Thai policy makers are familiar with the UK UHC model through several staff exchanges and study visits. Nevertheless, Thai authorities were aware of significant differences in the health delivery systems of the two countries, and also anticipated poor performance of the Thai QOF as a result of health system limitations, such as insufficient human and financial resources, unreliable databases, and conflict between the MOPH and the NHSO in some policy areas. Importantly, the newly established initiative had never been piloted. Despite this, the UCS executives maintained that they were confident in introducing the policy because the NHSO, health providers, and officers had gained experience in UCS management from the earlier P4P programme. Furthermore, QOF managers argued that they adopted a ‘learning by doing’ approach, meaning that different stakeholders could learn through their experiences and adapt to the programme accordingly. The QOF managers also saw the policy implementation as a capacity-strengthening exercise for NHSO staff and provider networks.

Regarding the development of quality indicators, the NHSO formulated indicators and the associated policy without formal consultation with key stakeholders, such as the MOPH, health providers, and professional organisations. However, personal consultations with relevant experts in the Ministry were carried out for some indicators. Two broad criteria were used to select indicators: 1) an indicator should not increase the workload of data entry and reporting, because performance could be measured using data already in the existing MOPH database; and 2) indicators should focus on performance at the CUP level, as opposed to the outputs and outcomes of each health facility.

Programme implementation: plan vs. reality

The NHSO did not start to disseminate QOF information to its regional offices until October 2013, which was the planned start date of the programme. After this initial dissemination, regional health boards were required to develop local indicators and criteria for QOF budget allocation. As pointed out by some informants, this resulted in a delay of approximately three to four months in developing the regional guidelines and disseminating them to the respective institutions, including primary care providers. As such, the QOF could not begin until January 2014 in some regions. Aiming to allocate the QOF budget to all CUPs by September 2014, the NHSO decided to analyse QOF performance on the core indicators using data that health facilities had submitted to the MOPH between April 2013 and March 2014. Hence, because preparation took time and the programme began later than expected, first-year QOF performance was assessed against data from that period. This means that measurement in the first year of the QOF was largely based on performance delivered up to approximately 9 to 10 months before providers had been informed about the QOF indicators.

“… QOF performance was measured based on the data of the 3rd and 4th quarter of last year [April to September 2013] and the 1st and 2nd quarter of the current year [October 2013 to March 2014]. The scores were not associated with current work performance. Additionally, the measurement did not align with the fiscal year, increasing difficulties in creating work plans.” (Health practitioner).

As mentioned above, there are two disparities between the NHSO’s QOF and the MOPH’s KPIs: 1) some indicators differ; and 2) shared indicators use different templates and timeframes for data entry and measurement. These two factors affected health facilities’ planning and working processes for data submission.

Key informants asserted that QOF monies were not consistently allocated according to provider achievement, and that high scores might not be associated with high-quality primary care delivery, for two reasons. First, some local indicators did not relate to healthcare quality or the performance of health providers, e.g. the amount of research conducted annually in particular facilities, and the use of data from IT systems to inform health service planning and delivery. Second, the criteria for QOF payments in different regions were not standardised, as the regional health boards had discretionary power to make decisions.

An analysis by the NHSO suggested that the QOF budget was allocated according to the QOF scores in only 6 out of 13 regions5. In the other regions, the QOF scores were taken into account together with other criteria, such as population size and deprivation level of the catchment area (e.g. hardship and conflict areas).

“The allocation of the QOF budget was based on the financial status of the CUP. In practice, the amount depended on the performance of the health facilities in the CUP. I thought that explicit criteria or guidance for the CUP were necessary.” (Health practitioner).

“In my region, the regional health board agreed to allocate the QOF budget based on [the] number of people registered in the area (40%), and QOF scores (60%).” (Regional NHSO officer).
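As a minimal illustration of the blended rule quoted above, the sketch below splits a regional QOF budget 40% by registered population and 60% by QOF scores. The weights come from this one regional officer’s account, not a standard NHSO formula; the CUP names and figures are hypothetical.

```python
# Hypothetical sketch of the 40% population / 60% QOF-score allocation rule quoted
# above; the weights and CUP figures are illustrative, not NHSO policy.

def blended_allocation(regional_budget: float,
                       cups: dict[str, tuple[int, float]],
                       w_population: float = 0.4,
                       w_score: float = 0.6) -> dict[str, float]:
    """cups maps CUP name -> (registered population, QOF score).
    Each CUP's share is weighted partly by population and partly by score."""
    total_pop = sum(pop for pop, _ in cups.values())
    total_score = sum(score for _, score in cups.values())
    return {
        name: regional_budget * (w_population * pop / total_pop
                                 + w_score * score / total_score)
        for name, (pop, score) in cups.items()
    }


print(blended_allocation(10_000_000, {"CUP A": (60_000, 820),
                                      "CUP B": (40_000, 640)}))
```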

Impediments in QOF implementation

During the first year of the Thai QOF programme, the crucial impeding factors were the inherent conflicts between the NHSO and MOPH, and weaknesses in the existing IT system. Consequently, there was miscommunication and inadequate coordination among healthcare workers and MOPH and NHSO staff, as well as inaccurately reported data, a lack of capacity for data entry and management, and errors in data transfer from peripheral offices to the national authority.

As health workers pointed out, the QOF was perceived as an NHSO-owned initiative, while the MOPH had developed its own policies to deal with priority health problems and improve primary care services. The lack of engagement between the NHSO and MOPH in the policy formulation phase resulted in miscommunication, culminating in misinformation and inadequate coordination for the translation of QOF policy into action. The focus group discussions made it quite apparent that senior health officers in the MOPH and its central departments were not aware of the NHSO QOF. Although some MOPH general inspectors acknowledged the introduction of the QOF in their regions, they paid attention only to the performance of healthcare delivery according to the MOPH’s direction and associated indicators. Hence, technical supervision and administrative guidance provided by the inspectors for QOF-related activities in most regions were limited.

The other critical problem that surfaced in the QOF’s introduction was the unreliable IT system, resulting in inaccurately-reported data on service delivery. Interviewees asserted that the capacity for data entry and management at CUPs, PCUs, and district health offices was inadequate. Moreover, interviewees in different institutions argued that health workers in some settings intentionally made up data in order to gain high QOF scores and payments. The worst case, as discussed during a focus group meeting, involved attempts at data manipulation.

“Some practitioners entered data accurately, while others intentionally made up data. So, they [the data] did not really reflect the true performance.” (Health practitioner).

“I had a field visit at a province. In this province, they [provincial health officers] were not concerned about under-recording, but over-recording. This was really happening.” (Data manager of a CUP).

Informants also reported technical errors in data transfer from peripheral offices to the national authority.

“I did not realise what was going on. Our performance last year was zero for 5 indicators. Moreover, the OP visits were marked zero. I believe that the problems originated from the data linkage system between our hospital and the regional health data centre.” (Health practitioner).

“Once we submitted our data to the health data centre, we found that our performance did not meet the indicator targets. Only 5 indicators, related to service provision, passed the assessment. When comparing with other health facilities, they also passed approximately 5–6 indicators, which I did not believe was representative of service delivery. I thought the issue was the database.” (Health practitioner).

The poor reliability of the data is also illustrated by comparing results from different databases and surveys for certain indicators. Table 3 shows that the coverage of some primary care services measured as QOF indicators differed substantially from the findings of a survey conducted by Mahidol University and a report from the MOPH during the same period, as suggested by a published document of the NHSO7. Owing to the limited capacity of the IT system, health personnel in PCUs could not obtain feedback on their performance from the provincial data centres. Without such information, it was difficult for PCUs to confirm the accuracy of their QOF scores and payments with CUP managers.

Table 3. Comparison between the average performance of the CUPs derived from the existing MOPH databases and the average performances of the CUPs derived from other sources in the 2014 fiscal year.

Two indicators are highlighted because data from other sources were available for comparison.

Indicator                                                          Information derived from      Information derived from
                                                                   the MOPH databases            other sources
Percentage of pregnant women receiving antenatal care 5 times
during the gestational period                                      10%                           49% (1)
Percentage coverage of cervical cancer screening in women
between 30–60 years within 5 years                                 36%                           67% (1); 68% (2)

Sources of information: (1) a survey conducted by Mahidol University, 2014; (2) a report of the MOPH, 2014; adapted from a published document by the NHSO7.

Provider perceptions, inadequate communication, and management issues

Most interviewees in this study agreed with the QOF policy principle that monetary incentives would be effective in enhancing service quality in the Thai UHC context, where financing and other resources are scarce. Some also pointed out that the policy helped strengthen teamwork between district hospitals and health-promoting hospitals in terms of transferring patients to receive proper care at the secondary level. They stated that, in order to be high achievers, health facilities in a particular CUP would need to cooperate, set service delivery plans together, and improve their referral system. However, besides the problems arising from the MOPH-NHSO conflict and the unreliability of data, many weaknesses and associated implementation gaps were observed. First, health workers perceived that the programme was managed in an unfair manner. Given that the QOF budget was deducted from the annual capitation payment for ambulatory care, the policy was perceived as unjustified because primary care providers did not receive full subsidisation of their service delivery.

“The budget for the QOF programme was deducted from the OP/PP budget [budget for out-patient/health promotion and disease prevention services]. It should have been from other sources. The OP/PP budget is actually aimed at subsidising service deliveries, so it isn’t fair for health providers [since they may receive less money than they had received previously and this may not be sufficient for providing ambulatory services].” (Health facility director).

“…the calculation of the budget for ambulatory services was based on the number of people registered in the catchment areas. Allocation of this budget based on the performance of health providers isn’t fair. The allocation should have been based on the same principle [payments based on the number of registered population].” (Regional NHSO officer).

Additionally, providers perceived that allocating a budget based on performance was unfair because the NHSO collected a proportion of the budget for ambulatory services and allocated it on the basis of each CUP’s performance instead of the registered population. While the total amount remained the same, CUPs with high QOF scores received higher QOF payments, thereby taking a proportion of other CUPs’ budgets in the same region.

“It [QOF budget allocation] was unfair… [and] inappropriate because a province that achieved high QOF scores would receive a larger budget and [would be] taking from other provinces’ UC budget.” (Health practitioner).

As also perceived by hospital administrators and health workers, the unfairness was exacerbated when the allocation of the QOF budget did not reflect actual quality improvement in service delivery at the CUP and PCU levels. Furthermore, there was perceived bias in performance measurement, since some QOF indicators and targets were relatively difficult to achieve in certain circumstances, e.g. providing screening tests for non-communicable diseases in an area with a large population, an urbanised culture and lifestyle, and a high rate of labour migration to other areas.

Second, inefficient transfer of information from policy makers to programme managers and practitioners was emphasised by interviewees as an important drawback of the Thai QOF. Traditional, bureaucratic communication approaches, involving lengthy official documents and formal face-to-face meetings with executives, severely impeded the dissemination of messages. Some informants from PCUs maintained that they did not have an in-depth understanding of the QOF because the CUP representatives who attended the regional meetings did not relay the information they obtained. It was also argued that this information deficit resulted in poor attitudes towards, and non-compliance with, the policy among service providers.

Third, most QOF indicators, such as coverage of screening or the occurrence of complications resulting from poor disease management, were designed to measure performance at the CUP level, even though district hospitals and health-promoting hospitals are different entities with separate financial management. As such, the QOF budget received by a CUP needs to be divided among its health facilities. During the interviews and focus group discussions, it was evident that, without guidance on QOF budget allocation within the CUP, staff at health-promoting hospitals found it difficult to negotiate a fair share of the budget with the directors of district hospitals and perceived that they were worse off under this programme. Although some CUPs demonstrated the improved collaboration between district hospitals and health-promoting hospitals expected from the QOF implementation, other CUPs experienced conflicts between the two entities over the QOF budget allocation.

“[We were] not told how much each indicator would deliver, so [we were] not sure how the budget should be further allocated to the health-promoting hospitals. We had to find [a] consensus on explicit criteria for budget allocation in order to avoid conflicts.” (District health officer).

“This method of allocation [allocation of the QOF budget through CUP] caused disputes among the health facilities because there were no explicit criteria. Each CUP board could create their own way to allocate the budget.” (Health practitioner from a health-promoting hospital).

Discussion

As with many health insurance schemes, there has been strong political will and commitment within the NHSO to improve the quality of primary care under the Thai UCS8. P4P was chosen by the NHSO to improve service quality and reduce the variation in performance among primary care providers. Moreover, this initiative, if introduced properly, may be effective in improving the governance of resource allocation, as it monitors and evaluates the performance of CUPs in terms of inputs, processes, and outcomes. However, this study suggests that the process of QOF policy formulation did not follow the principles of evidence-based and participatory policymaking, which have long been embedded in Thai health systems, including the NHSO’s policy decisions9. The selection of QOF indicators, a crucial policy instrument, is a case in point. The initial selection of indicators by the NHSO without external stakeholder engagement is not unique, but other countries that have implemented P4P have evolved their indicator development and evaluation processes to be more participatory. For example, the National Institute for Health and Care Excellence (NICE) has led indicator development and evaluation for the QOF on behalf of the NHS in the United Kingdom since 2008. NICE uses a systematic process involving experts in the respective disciplines, NHS managers, and practitioners in order to ensure that the indicators and measurements are technically robust, effective in quality improvement, and well accepted by key stakeholders10. In contrast, the Thai QOF development did not follow an explicit process for indicator development; it was based on non-technical criteria and informal consultation with a limited number of stakeholders. Furthermore, it is not clear whether regional health boards took the aims of the QOF into account while developing local indicators, as some indicators were not relevant to measuring the quality of primary care.

Besides relevant performance indicators and measurement methods, reliable databases on service delivery and civil registry are necessary for the P4P model. This analysis suggests that the health-related IT systems and databases were the weakest component of the Thai QOF programme. This was compounded by the lack of capacity and technical expertise of personnel in different cadres at the country and peripheral levels. An on-going study for the development of QOF indicators in Thailand reveals various types of inaccurate information contained in the MOPH databases11. Since health information and data are considered building blocks (http://www.wpro.who.int/health_services/health_systems_framework/en/) of the health system, it is likely that this weakness also affects a broader range of initiatives managed by the MOPH and NHSO. One example of an unsuccessful introduction of computerised information systems in Thailand is the abandoned Telemedicine Network launched in 1998; it failed because of the lack of IT skills among health professionals, the low level of system acceptance among users, and the rapid changes in the IT system12,13. In addition, a study on the QOF in Switzerland14 revealed that an incomplete database was unable to reflect the quality of healthcare delivery. Furthermore, different studies suggest that healthcare workers should be able to perform proper data entry, and these data should be used as feedback for improving the quality of healthcare delivery14–16.

In the views of QOF managers and some practitioners, the Thai QOF should be able to strengthen collaboration between district hospitals and health-promoting hospitals. District hospitals should also provide support to health-promoting hospitals in terms of health personnel, medicines, medical devices, technical support, and quality control of services. This is similar to the findings of a systematic review by Gillam et al. (2012)17, which indicated that the UK QOF strengthened teamwork among practitioners in health facilities. However, the improved cooperation within the CUP asserted by some interviewees was not clearly evident in this evaluation. Ironically, some informants mentioned inadequate coordination between district hospitals, as CUP leaders, and PCUs, especially in terms of financial resource allocation. This may be explained by several factors. First, it was the first year of programme implementation, and the leaders and staff of health facilities within the same CUP were still learning and adapting to the programme; collaboration may therefore improve in the future. Second, health-promoting hospital staff may have fewer incentives than district hospital staff to cooperate, because the QOF budget is allocated directly to the district hospital (as the CUP’s main contractor), while health-promoting hospitals have no negotiating power with the district hospital. Third, the lack of central guidance on how the incentives should be allocated led to inconsistent QOF payment allocation from CUPs to health-promoting hospitals.

A key mechanism for successful P4P implementation to improve the quality of healthcare is that health providers are well aware of, and appreciate, the incentives offered, and have the ability to change their behaviour or strengthen their service delivery capacity in order to achieve predetermined performance targets18. In the Thai QOF, however, owing to the delay in developing the indicators, inadequate policy communication, and IT problems, it was very difficult for providers to improve their performance in such a short period of time. The QOF score was measured from performance delivered 8–9 months before the NHSO and the regional health boards issued the indicators. During this period, providers were unaware of the policy, including its incentives, and it is therefore unreasonable to expect any improvement. Arguably, performance measurement in the second year should be more justifiable than in the first, because providers already know what the core indicators are, whereas local indicators can change annually depending on negotiations within the regional health boards. This case study reflects the importance of setting and keeping an appropriate timeline for the implementation of P4P programmes. In other settings, such as the UK QOF, programmes can aim to increase providers’ performance by establishing a reporting system that provides ongoing, real-time feedback on achievement. The UK QOF has also sought to harness the quality improvement potential of reputational incentives, in parallel with financial ones, through the public reporting of QOF achievements19. However, both of these approaches rely on a robust and reliable IT system, and it may not be possible to introduce them into the Thai QOF unless the NHSO overcomes its IT problems.

This study contributes to the public policy literature by providing empirical evidence from a developing country’s health system for existing policy analysis models and theories. We learn that the UK QOF was transferred to the Thai setting with significant adaptation during the formulation stage, including indicator development, point value determination, budgeting, management, feedback, and the database for the QOF, resulting in great differences from the prototype. As pointed out by Dolowitz and Marsh20, the transferred elements can be ideologies, interventions, or administrative arrangements. In Thailand’s QOF, the transfer of the P4P concept and principles was relatively effective compared with the learning of instrumental details and programme structure, since the latter elements were influenced by the political and health delivery context. Such contextual factors, as well as the capacity for programme management and primary care provision at the subnational level, played a key role in the QOF implementation, and possibly in its outcomes too. According to the implementation model, in order to meet new policy goals, government service providers usually seek to obtain the information necessary to guide policy execution and draw lessons from several sources21. As such, policy learning on the QOF introduction might also take place locally, by exchanging knowledge among practitioners and drawing lessons from past experiences, such as the introduction of the earlier phase of the P4P for infrastructure and human resource development.

The findings of this study should be interpreted in light of certain limitations. The study relied heavily on stakeholders’ interpretations of, and perceptions towards, the QOF. Despite efforts to triangulate the information, the researchers were not able to interview most policy makers from the MOPH, including executives and General Inspectors. However, a small number of these policy makers participated in a stakeholder consultation meeting, and their views are crucial because they play an important role in the policy formulation and implementation process. A further study to evaluate the health outcomes accrued as a result of the Thai QOF is recommended; such an evaluation is likely to require a longer time frame and different study designs to detect an impact. Furthermore, studies on other aspects of the QOF are needed, such as costs, cost-benefit, value for money, and the contribution of the QOF to primary care improvement.

Conclusions

Although there were impediments to introducing the Thai QOF programme, it benefited from strong political will and commitment from the NHSO to improve the quality of primary care under the Thai UCS. Lessons learned from the current Thai QOF will be useful for policy makers and programme implementers at both the national and international levels in ensuring effective policy transfer and implementation, not only for similar P4P programmes, but also for other public health initiatives.

Ethical approval and consent

Ethical approval for this study was obtained from the Institute for the Development of Human Research Protection (http://www.ihrp.or.th/), Thailand (document number, 970/2558). Written informed consent was obtained from the participants.

Data availability

Transcriptions of recorded interviews (in Thai) with key informants and focus groups are available from the Open Science Framework (https://osf.io/wqvfd/), DOI: 10.17605/OSF.IO/WQVFD22.

How to cite this article: Khampang R, Tantivess S, Teerawattananon Y et al. Pay-for-performance in resource-constrained settings: Lessons learned from Thailand’s Quality and Outcomes Framework [version 1; peer review: 1 approved, 1 approved with reservations]. F1000Research 2016, 5:2700 (https://doi.org/10.12688/f1000research.9897.1)
Open Peer Review

Reviewer Report 31 Jan 2017
Stephen J. Gillam, Department of Public Health and Primary Care, University of Cambridge, Cambridge, UK 
Approved with Reservations
Context – The National Health Security Office (NHSO) in Thailand introduced a pay-for-performance programme in 2013 based on the UK’s Quality and Outcomes Framework (QOF) as a requirement of all primary care providers (via the MOPH). Presumably due to space ...
Reviewer Report 13 Dec 2016
David Hughes, Department of Public Health & Policy Studies, Swansea University, Swansea, UK 
Approved
While Thailand’s UHC reforms have attracted considerable international attention and praise, primary care has remained one of the more problematic aspects of the healthcare system because of concerns about staffing and quality. The English-language literature remains thin when it comes ...