Keywords
interprofessional education, interprofessional core competencies, graduate attributes, curriculum transformation, curriculum mapping
Any health professional university curriculum aims to ensure that the knowledge, skills and attitudes of students are shaped by the curriculum and that attributes which help students become compassionate and inquisitive health professionals are instilled in them. McKean et al. highlight the need for medical educators to align their learning objectives with the core competencies of the role in question if they strive to ensure that students achieve a degree of competency.1 The limited studies in interprofessional education do highlight activities that may be used to facilitate the development of interprofessional core competencies, but no such studies have been conducted in South Africa.2,3 It is our understanding that, if health professional university students are exposed to the interprofessional core competencies effectively, the result may be health care professionals with an improved understanding of interprofessional practices, thus improving practices within their specific professions. The Interprofessional Education Collaborative Expert Panel defines interprofessional core competencies as an integration of knowledge, skills and values/attitudes that defines teamwork across the health professions, and with patients, their families and the communities they live in, with the intention of improving health outcomes.4 Interprofessional practice occurs when health professionals from different disciplines work together with patients, families, carers, communities and each other to render comprehensive health care.5 For example, a person who has recently had a stroke would be seen by a team of health professionals in one consultation for assessment, and a treatment programme would then be planned with the client and their family and/or caregiver, instead of each health professional being seen separately.
Interprofessional competencies are increasingly used by many professions to comprehensively describe ideas such as interprofessional collaboration.6 Interprofessional collaboration takes place when two or more professionals work together to achieve common aims and solve a variety of complex challenges.7 For example, the Royal College of Physicians and Surgeons of Canada’s CanMEDS competency framework has been embraced globally by professions such as nursing, chiropractic, paramedicine, physician assistants, family medicine and veterinary medicine.8
In this study, a systematic review was conducted to determine whether any research studies had been done in a South African context exploring how learning and teaching activities were used to develop core competencies among students.9–13 To the best of the authors’ knowledge, no such studies had been conducted in South Africa. Internationally, only five studies were found that incorporate interprofessional core competency development into their programmes.14 In these five studies, the higher education institutions did not include all competencies in their learning and teaching activities, and there was no evidence of the impact of their programmes on improving health outcomes for clients, patients, families or communities. This study therefore suggests that a transformative curriculum is required to reflect interprofessional core competency development in health professions’ training over the continuum of learning. The study builds on the findings of a previous study conducted by GF.9
To meet the needs identified in curriculum development, this study used a Delphi method to identify teaching strategies that aim to develop interprofessional competencies in undergraduate health care students at the University of the Western Cape. The study not only identifies teaching strategies, but also considers assessment strategies that could be used to develop interprofessional education (IPE) curricula.
The study was approved by the Senate Research Committee of the University of the Western Cape, South Africa (registration number: 14/9/25). Participation in the study was voluntary and written informed consent was obtained from participants beforehand to use their data for analysis and publication. The data were collected and processed anonymously.
Denzin and Lincoln state that qualitative research offers methodological tools with which to understand the deeper meanings associated with multifaceted phenomena and processes in practice.15 Besides the traditional approaches to qualitative inquiry, such as grounded theory, phenomenology, constructivist inquiry and narrative inquiry, the Delphi method is an additional approach not often highlighted in the literature. The Delphi method is a logical approach based on the philosophical assumptions of philosopher and educator John Dewey, who believed that social science research should directly relay to and inform everyday practice and decision-making.16 According to Birdsall, the Delphi method stresses structured anonymous communication between the authors and experts on a certain topic, with the aim of reaching consensus in the areas of policy, practice or organisational decision-making.17,18 The Delphi method usually involves approximately three rounds of surveys distributed to a panel of experts, with each round informed by responses to the previous one. The process can be repeated until consensus is reached. In this study, the Delphi method was used to reach consensus on the most appropriate activities and assessment methods to use in an interprofessional curriculum that would assist in instilling interprofessional core competencies in undergraduate health care students.
Selection of appropriate participants is regarded as one of the most important phases in the entire Delphi process, as it directly impacts the quality of the results produced.19–21 Since the Delphi technique concentrates on prompting expert views over a short period of time, the selection of participants usually relies on the disciplinary areas of knowledge and skills required by the specific issue at hand.22 As interprofessional education is a relatively new area in South Africa, it was initially difficult to identify local experts in the field. The authors aimed to recruit between 15 and 20 participants, and names were garnered from the initial experts identified so as to include as diverse a group of experts as possible. The authors approached 95 experts, more than required, so that if some were unavailable there would still be enough participants for the study. Following this process, 29 experts formed the group of invited participants. In the first phase of the study, 11 of the 29 invited participants agreed to participate, yielding an initial response rate of 37.93%. Ludwig15 indicates that “the majority of Delphi studies have used between 15 and 20 respondents”. Based on this, the authors decided to expand the expert panel base and invite more participants, with a target of achieving a positive response of between 15 and 20. Forty potential participants from the African Interprofessional Network (AfrIPEN) were invited to participate in the study. Eight responded positively, making a total of 19 participants. The demographics of 17 of these participants can be found in Table 1. In round one, all 19 experts participated; in round two, 16 of the experts completed the questionnaires.
The experts in this group were initially contacted via e-mail and came from various organisations, both local and international. International organisations included the Centre for the Advancement of Interprofessional Education (CAIPE), UK; University of Missouri, USA; Suez Canal University, Egypt; University of Cairo, Egypt; University of North Carolina, USA; University of North Texas, USA; Curtin University, Australia; University for Development Studies, Ghana; and the University of Sudan. South African institutions included Stellenbosch University, University of the Western Cape, University of Cape Town, University of Pretoria, University of KwaZulu-Natal, University of the Free State, and Psych Care in Pietermaritzburg.
The authors obtained the contact information of potential participants from the publications of experts in high-impact journals and from organisations/networks such as the Africa Interprofessional Education and Collaborative Practice Network (AfIN) and the African Interprofessional Network (AfrIPEN). Participants who were contacted also identified other experts and made their contact details available to the authors. All identified participants and experts in the field of IPE received an invitation letter via e-mail containing information regarding this study and a request for their assistance as experts in the field of IPE (see Extended data49). A consent form was attached to the e-mail, to be completed and returned to the authors should they agree to participate in the study (see Extended data49). Once all the consent forms had been received, the participants were sent a link to begin the Delphi process by completing an online questionnaire in Google Forms. The first section of the questionnaire covered demographics: participants had to indicate their profession, years of experience in IPE, the year level of student engagement in IPE and the average number of students engaged in IPE per annum. The Psychology Ethics Committee of the University of Aberdeen posits that it is good practice to assign a numerical reference to participants in research studies for the purposes of anonymity.23 This was particularly necessary in this study in order to track participants’ replies and verify their responses during the next round of the Delphi study. The questionnaire was based on the six interprofessional core competencies identified by the Canadian Interprofessional Health Collaborative (CIHC), whereby participants were asked to identify activities and methods of evaluation for each competency domain.24 The questionnaire was administered online, which allowed participants to complete it at a time and in a space in which they were comfortable.
The authors enabled settings in Google Forms to be notified via e-mail when questionnaires had been completed by participants, according to their allocated participant numbers; through this method, the authors could keep track of the total number of completed questionnaires.
Prior to the Delphi study, the authors presented the two competency documents (CIHC and Interprofessional Education Collaborative Expert Panel) to faculty, who collectively recommended the use of the six competency domains outlined by the CIHC.4,24 The participants in the Delphi study had to review the six competencies listed by the CIHC combined with the two additional competencies suggested by the Interprofessional Education Collaborative Expert Panel. For the purposes of this study, the focus is primarily on the six competencies listed by the CIHC, together with the two additional core competencies of the Interprofessional Education Collaborative Expert Panel, i.e., values/ethics for interprofessional practice, and roles and responsibilities. Round one required participants to list as many activities as possible to instil each of the eight core competencies in undergraduate students. While listing activities, they had to think of different assessments that could be used to evaluate the different competencies.
During round two, the authors compiled a second questionnaire in which participants had to rate the activities and assessment practices presented in round one as most favourable to instil the IPE core competencies. Responses were rated on a five-point scale ranging from strongly agree to strongly disagree. The most common activity types and assessment methods from round one were selected by the authors; items were considered ‘common’ when three or more participants made the same suggestion. Participants were given space on the questionnaire to make further comments should they feel that the item list was not appropriate or not in alignment with comments they had made previously. Participants had to state whether they agreed with the listed assessments and activities by clearly stating “yes” or “no”. Since there were no objections and no comments indicating the inappropriateness of the listed items, the authors concluded that consensus had been reached at the completion of round two. This decision was communicated to all participants, who were given a final opportunity to dispute it; none did.
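The selection of ‘common’ items between rounds can be sketched as a simple frequency count. The suggestion labels below are hypothetical placeholders (the actual free-text responses are in the underlying Round 1 data set); only the threshold of three or more participants comes from the text.

```python
from collections import Counter

# Hypothetical round-one suggestions, already normalised to activity labels.
suggestions = [
    "case studies", "role play", "case studies", "simulation",
    "case studies", "role play", "workshops", "role play", "simulation",
]

# An item is 'common' when three or more participants suggested it,
# matching the threshold described in the text.
counts = Counter(suggestions)
common_items = [item for item, n in counts.items() if n >= 3]
print(common_items)  # → ['case studies', 'role play']
```

Only the common items would then be carried forward into the round-two rating questionnaire.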
The questionnaires in the Delphi process included both qualitative and quantitative aspects. Hsu et al. emphasise that researchers need to find a suitable process to deal with the qualitative information collected.22 In this study, the qualitative data, in the form of comments, were read together with the suggested activities and assessment practices to further understand the reasons for the listed items.
For each round in the Delphi study, experts were invited to rate each statement on a Likert-type scale, with an option to comment on each statement as desired, and finally to rank the statements in order of importance.25 Quantitative analysis of the Delphi study included calculation of the percentage response rates; the percentage for each level of agreement for each statement, to compensate for varying response rates; and the median, range, mean (SD) and their associated group rankings using the importance ratings. The final results from the Delphi study were reported as percentages to reflect the rate of agreement between experts. Two nominal categories were formed to report the data from the Likert scale: Strongly Agree and Agree were combined, and Disagree and Strongly Disagree were combined, for the purposes of reporting the findings. Green14 suggests that at least 70% of Delphi participants need to rate three points or higher on a four-point Likert-type scale to reach consensus on subject matter.
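The consensus calculation described above can be illustrated with a minimal sketch. The ratings below are hypothetical (not the study’s data); the sketch shows the combining of agreement categories, the summary statistics named in the text, and a 70% agreement threshold in the spirit of Green’s criterion.

```python
from statistics import mean, median, stdev

# Hypothetical ratings for one statement from 16 round-two experts, on a
# five-point scale: 1 = strongly agree ... 5 = strongly disagree.
ratings = [1, 2, 1, 2, 2, 1, 3, 2, 1, 2, 4, 1, 2, 2, 1, 2]

# Combine Strongly Agree and Agree into one nominal category, as described
# in the text, and express agreement as a percentage of respondents.
agree = sum(1 for r in ratings if r <= 2)
agreement_pct = 100 * agree / len(ratings)

print(f"agreement: {agreement_pct:.1f}%")
print(f"median: {median(ratings)}, mean (SD): {mean(ratings):.2f} ({stdev(ratings):.2f})")

# Consensus check against a 70% agreement threshold (after Green).
print("consensus reached" if agreement_pct >= 70 else "no consensus")
```

With these example ratings, 14 of 16 experts agree (87.5%), so the statement would be taken as having reached consensus.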
Hasson et al. state that the Delphi technique is based upon the assumption of safety in numbers (i.e. many experts are less likely to arrive at a wrong decision than a single person).26 Choices are then strengthened by logical argument in which assumptions are confronted, thus helping to increase validity. Threats to validity arise primarily from pressure for convergence of predictions, which challenges the Delphi’s forecasting capability. However, the use of experts on a particular topic can help to increase the content validity of the Delphi technique, and the use of successive rounds of the questionnaire increases the concurrent validity.26
Trustworthiness of the data was ensured by using Guba’s11 four criteria of trustworthiness:
i) Credibility
The authors adopted appropriate, well-recognised research methods, which were familiar to the culture of the participating institution, and used random sampling of individuals who served as participants in the study. Triangulation was achieved by using different methods, and different participants were selected for different phases of the research study in varying contexts. Detailed descriptions were given of the background to the study, and member-checking of the data collected was done in the Delphi study by allocating numbers to participants and having them confirm their data.
ii) Transferability
The authors provided background data in the study to establish the specific context and gave a detailed description of the phenomenon in question, to allow comparisons to be made with other/similar institutions.
iii) Dependability
The different methods used in this study allowed for overlap and integration in order to develop an IPE model. In-depth methodological description was given in chapter two, which allows this study to be repeated.
iv) Confirmability
Researcher bias was reduced through triangulation of the data, and all assumptions and beliefs of the authors were outlined in each chapter. Shortcomings in the methodology of the study and their likely effects are listed as limitations in the final chapter of the study, and an in-depth methodological description is provided so that the integrity of the research results can be scrutinised by experts in the field.
The activities and assessments suggested by the participants for the IPE core competencies were ranked and captured accordingly (see Table 2).48
Fourteen of the experts participated in this round. The suggestions for activities and assessment strategies that were common to the majority of participants were summarised and sent back to the participants for confirmation in round two.48 Participants were requested to rate the common suggestions on activities and assessments, given as Likert items (see Table 3 and Table 4), and to make comments should there be any discrepancies.
Considering the above-mentioned assortment of activities, it is evident that similar activities can be used to instil more than one competency; for example, case studies can address interprofessional communication, patient/client/family/community-centred care, role clarification, interprofessional conflict resolution and values/ethics for interprofessional practice. Another example is role play, which can be used to develop the core competencies of role clarification, collaborative leadership, interprofessional conflict resolution and values/ethics for interprofessional practice. Such overlap, however, can appear repetitive and confusing when designing new IPE activities and curricula. Barr et al. provide some guidance on how to classify the different learning activities frequently used in IPE.27 They state that using different methods in combination with each other can be very advantageous for students. The classification is as follows, and the results are discussed accordingly:
i) Exchange-based learning, e.g. case studies and debates
ii) Action-based learning, e.g. workshops, problem-based learning, collaborative enquiry and continuous quality improvement
iii) Observation-based learning, e.g. joint visits to a patient by students from different professions, shadowing another profession
iv) Simulation-based learning, e.g. role-play, games, skills labs and experiential groups
v) Practice-based, e.g. co-location across professions for placements, out-posting to another profession and interprofessional training wards
vi) E-learning, e.g. reusable learning objects relating to the above
vii) Blended learning, e.g. combining e-learning with face-to-face learning
viii) Received or didactic learning, e.g. lectures.
The following main activities were highlighted by the expert panel and were common to most of the IPE core competencies: case studies, joint clinical placements, simulations, role plays and workshops/discussions.
Case studies can be considered a problem-based learning approach and are classified under exchange-based learning, according to Barr et al.27 Bonney highlights several advantages of using case studies as a teaching strategy.28 Firstly, case studies support the development of the higher levels of Bloom’s taxonomy of cognitive learning,29 moving beyond recalling knowledge to include analysis, evaluation and application. Secondly, case studies facilitate interdisciplinary learning and can be used to make connections between specific theory and real-world societal issues and applications. Case studies can also increase student motivation to participate in class activities, which promotes learning and improves performance on assessments.
Students in groups can be presented with a well-structured problem or case study on which they have to work collaboratively, in a once-off session or over a week or longer, depending on the outcomes of the session. This encourages active learning among team members. A case study lends itself to being open-ended: it allows realistic problems to be used to stimulate interdisciplinary discussion, and it promotes critical thinking, learning and participation among students, especially in terms of their ability to view an issue from multiple perspectives and to grasp the practical application of core course concepts.30
Workshops demonstrate modern principles of teaching such as the active engagement of learners. They provide opportunities for interaction that enable teachers to connect the material to the context of the learners, and they allow for group interaction, which is important for trainees who are becoming increasingly isolated in their work.31 When planning workshops, it is suggested that student preparation and attendance be a requirement, allowing for greater success of the workshop.32
Two activities are classified under simulation-based learning: role plays and simulations. Simulations provide students of all professions with a safe space to interact with each other collaboratively, as well as opportunities for a novice’s eventual transition to becoming an expert. Simulated activities give students an opportunity to explore and appreciate the roles of other health professionals. Fowler-Durham and Alden confirm that simulation is intended to mimic reality whilst offering a skills-based clinical experience in a safe and secure setting.33 Hovancsek describes the aim of simulation as ‘to replicate some or nearly all of the essential aspects of a clinical situation so that the situation may be more readily understood and managed when it occurs in reality in clinical practice’.34 Russell et al. state that role plays and simulations function as learning tools for teams, groups or individuals, as they can engage with each other either online or face to face.35
Joint clinical placements are a vital part of undergraduate education, allowing students to transform theory into practice by engaging in ‘real-life’ experiences that strengthen the academic programme content covered at the institution. Koh warns that students who are unable to link theory and practice could be left ‘floundering, lacking in confidence and disenchanted, with some being forced to leave’.36 The core element of a clinical placement is that learning occurs by doing, since problems associated with clients/patients are placed in context and critical thinking can be developed.37
The main suggestions given by the expert panel on assessment methods aligned to the suggested activities are portfolios, reflection and the development of appropriate rubrics, which are discussed below.
Portfolios are ideal as an assessment tool because they allow for critical analysis of their contents, which reflect on a particular student, group or community. They can therefore be considered multipurpose instruments, since they can be used for assessment, monitoring and planning, reflection, learning and personal development.38 Portfolios are known to stimulate reflection, as students are often required to look back on work they have done, analyse what they have or have not achieved, and supply reasons for this. Portfolios are often compiled over a long period of time to allow a sufficient interval in which to collect information and to reflect on the knowledge gained from these experiences. Brown defines a portfolio as ‘a private collection of evidence, which demonstrates the continuing acquisition of skills, knowledge, attitudes, understanding and achievements. It is both retrospective and prospective, as well as reflecting the current stage of development and activity of the individual’.39 Since many medical curricula are based on competency criteria, students can sort the evidence in their portfolios into sections corresponding to the different competencies to be assessed and use captions to explain what the evidence shows about a specific competency.38
Sandars states that many assessments include ‘levels of reflection’ and that this hierarchical model is based on the notion of depth of reflection.40 Superficial reflection is considered to occur when there is only a report of events, whereas deeper reflection includes a ‘stepping back’ from events and actions, with evidence of the encounter and possible change to current views and perceptions. This deeper level is equivalent to ‘transformative learning’ taking place. Reflection can be considered a purposeful critical analysis of knowledge and experience in order to achieve a deeper meaning and understanding of a specific body of information. Reflection cannot be seen in isolation from reflective learning and reflective practice.40 In a study on reflection by Morison et al., students felt that learning together in both lectures and on clinical placement allowed them to gain optimum understanding of their own and others’ roles, and that the real-world experience helped them to appreciate the importance of teamwork and communication skills.41 Mann et al. confirm that professional competencies can be assessed through reflection and that different levels of reflection should be established for each year level.42
The third assessment method highlighted by the expert panel is the use of rubrics. Rubrics are a good indicator to students of which aspects of their performance will take priority and how marks/percentages will be allocated to specific tasks for assessment purposes. The use of rubrics in assessment offers a means to provide the desired validity in assessing complex competencies without forfeiting reliability.43,44 When designing rubrics, Reddy strongly suggests that assessors ensure that the scoring criteria reflect the desired core competencies that would indicate success in curriculum design and practice.45 Rubrics are descriptive scoring schemes developed by educators or others (clinicians/supervisors/peers) to guide the analysis of students’ written or practical work as part of a process of assessing their efforts.46
The activities above can instil core competency development, and the suggested assessments can evaluate it, but neither can be successfully adopted without the principles suggested by the experts. For higher education institutions new to designing an IPE curriculum, the experts in the field of interprofessional education suggested applying principles such as treating all the health professions as equals, showing mutual respect, valuing differences, working towards common goals, securing commitment from the faculty including its leadership, and aligning timetables, which includes shared curriculum and assessment practices, to name but a few. Core competency development has been largely guided by international influences, but these cannot be applied to all contexts without adaptation.
This study used a Delphi approach to identify teaching and assessment strategies that aim to develop interprofessional competencies in undergraduate health care students. Consensus was reached by an expert panel on learning activities and assessment methods that instil the development of interprofessional core competencies. These identified strategies will form a crucial aspect of developing an IPE curriculum, especially in a South African context. The learning outcomes in such an IPE curriculum need to be clearly outlined and linked to each respective year level of health professions training at tertiary institutions. There is growing evidence that intensive approaches to learning are more likely to be connected with higher-quality learning outcomes.47 The development of an IPE model that incorporates a curriculum as described above will allow for flexible application of learning outcomes that are both challenging and reflective of the cognitive level of learning across the learning continuum.
Although the number of experts who participated in the study fell within the typical range for Delphi studies, a larger number of participants would have yielded a richer data set. The first round of the Delphi study took almost a year to complete, as the authors had to send monthly reminders to participants who had consented to participate.
IPE is an emerging field, and the literature is constantly growing as more experienced academics and practitioners enter it. In light of the COVID-19 pandemic, many higher education institutions have resorted to online learning and activities for IPE. It would therefore be worthwhile to conduct a follow-up survey to get feedback on these activities and assessment strategies, which could be added to this study.
Figshare: Consensus on activities and assessment methods that instil interprofessional core competencies (data sets), https://doi.org/10.6084/m9.figshare.17134679.v1.48
This project contains the following underlying data:
• Delphi Responses – Round 1.pdf (Identification of activities and assessments that develop interprofessional core competencies)
• Delphi Confirmation – Round 2.pdf (Confirmation of activities and assessment strategies that develop interprofessional core competencies)
Data are available under the terms of the Creative Commons Zero “No rights reserved” data waiver (CC0 1.0 Public domain dedication).
Figshare: Consensus on activities and assessment methods that instil interprofessional core competencies (appendices), https://doi.org/10.6084/m9.figshare.18771038.v1.49
This project contains the following extended data:
• Invitation Letter.pdf
• Delphi questionnaire – Round 1.pdf
• Delphi questionnaire – Round 2.pdf
• Consent form.pdf
Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).
Is the work clearly and accurately presented and does it cite the current literature?
No
Is the study design appropriate and is the work technically sound?
No
Are sufficient details of methods and analysis provided to allow replication by others?
Partly
If applicable, is the statistical analysis and its interpretation appropriate?
No
Are all the source data underlying the results available to ensure full reproducibility?
Partly
Are the conclusions drawn adequately supported by the results?
No
References
1. Interprofessional Education Collaborative: Core competencies for interprofessional collaborative practice: 2016 update. Interprofessional Education Collaborative, 2016.
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Dental Education, Interprofessional Education.
Is the work clearly and accurately presented and does it cite the current literature?
No
Is the study design appropriate and is the work technically sound?
Partly
Are sufficient details of methods and analysis provided to allow replication by others?
Partly
If applicable, is the statistical analysis and its interpretation appropriate?
No
Are all the source data underlying the results available to ensure full reproducibility?
Partly
Are the conclusions drawn adequately supported by the results?
Partly
References
1. Meijering J, Kampen J, Tobi H: Quantifying the development of agreement among experts in Delphi studies. Technological Forecasting and Social Change. 2013; 80(8): 1607–1614.
Competing Interests: No competing interests were disclosed.
Reviewer Expertise: Health Professions Education and Nursing education.
Version 1 (28 Jan 22): read by invited reviewers 1 and 2.