1 Department of Internal Medicine, Lenox Hill Hospital, North Shore-LIJ Health System, New York, NY, 10075, USA
2 Department of Research, Lenox Hill Hospital, North Shore-LIJ Health System, New York, NY, 10075, USA
3 Department of Internal Medicine, North Shore-LIJ Health System and Hofstra North Shore LIJ School of Medicine, Great Neck, NY, 11021, USA
Abstract
Background: Recent changes in healthcare delivery have necessitated residency education reform. To adapt to these changes, graduate medical education can adopt a chief resident-led clinical curriculum. Chief residents are ideal clinical instructors, as they are recent graduates who have excelled in their residency programs. To make effective use of the limited time available for education, chief residents can implement active learning techniques. We present a chief resident-led, small-group, problem-based curriculum for teaching first-year internal medicine residents, and provide preliminary data supporting the efficacy of this approach.
Methods: The seminar consisted of eleven four-week modules. Week 1 was a team-based crossword competition. Weeks 2–4 were small-group, problem-based clinical reasoning sessions taught by chief residents. The program was evaluated via pre- and post-module multiple-choice tests. Resident satisfaction data were collected via self-reported, anonymous surveys.
Results: Preliminary results revealed a statistically significant increase from pre-test to post-test score for 9 of the 11 modules: chest pain, fever, abdominal pain, shock, syncope, jaundice, dizziness, anemia, and acute kidney injury. Additionally, resident satisfaction surveys showed that this teaching approach was an enjoyable experience for our residents.
Discussion: Our chief seminar is an evidence-based, clinical reasoning approach to graduate medical education that uses active learning techniques. It is an effective and enjoyable method for educating internal medicine residents and, because of its reproducibility, it can be applied throughout residency education.
Background
The changing landscape of healthcare delivery has made residency education reform a necessity1. In the hospital, residents must navigate an increasingly complicated healthcare system while expeditiously diagnosing, treating, and discharging patients. Furthermore, medical knowledge is expanding in both the basic and clinical sciences, as evidenced by the dramatic rise in the quantity of publications in the medical literature, particularly randomized controlled trials2. To keep pace with these demands, graduate medical education can adopt a chief resident-led clinical curriculum. Chief residents are ideal clinical instructors, as they have the knowledge base of an attending clinician while still understanding the learning needs of residents. To make effective use of the limited time available for education, chief residents can implement active learning techniques. Approaches that fall under active learning include problem-based learning3–6, collaborative group work, and peer instruction7,8. Small-group learning is an effective way to apply these techniques9,10. The application of active learning principles to medical education has been increasingly promoted in the medical literature1,11–13. Active learning also allows for the addition of creative modalities, such as concept mapping, games, puzzles, and extrinsic rewards4,9,14,15. We present a chief resident-led, small-group, problem-based curriculum for teaching first-year internal medicine residents, and provide preliminary data supporting the efficacy of this approach.
Program description
Participants included the intern (i.e. postgraduate year 1, PGY-1) class at a large, urban, tertiary-care hospital. The PGY-1 class consisted of 16 preliminary interns and 27 categorical interns, for a total of 43 participants. During any given module, 16 to 18 interns on the general medical floor participated in the teaching sessions and program evaluation. Week one of the seminar included interns, as well as PGY-2 and PGY-3 residents.
Program structure
The chief’s seminar curriculum consisted of eleven four-week modules. The module topics were: dyspnea, chest pain, fever, abdominal pain, shock, altered mental status (AMS), syncope, jaundice, dizziness, anemia, and acute kidney injury (AKI). Each module consisted of four one-hour weekly sessions, with the exception of the chest pain and AMS modules, which had only three sessions because of scheduling conflicts. Every module began with a competitive session during Week 1. Weeks 2 through 4 were composed of small-group discussions examining module subtopics in greater depth.
Program content
Week 1: The first week of the seminar began with a crossword competition that served as an important tool for engaging and motivating residents. All general medical floor interns and residents were included in this competition. The participating house-staff were divided into approximately six groups of four. Each member of the team was given a module-specific crossword puzzle covering all aspects of the chief complaint (Figure 1). The crossword puzzle was created by the chief residents using a free, downloadable program (http://www.eclipsecrossword.com). Each month a new crossword puzzle was created, with each puzzle taking 1 to 2 hours to prepare. During this hour-long session, team members worked cooperatively while competing with other teams to complete the crossword puzzle. The team that completed the most crossword clues in the shortest amount of time won the competition for that module. At the end of the session, all of the answers were reviewed with the house-staff. In addition to the motivation inherent in the competition, teams also competed for $40 worth of gift cards. A photograph of the winning team was distributed via email to the Department of Medicine.
Figure 1. Crossword puzzle for dizziness module.
This is an example of a crossword puzzle from the chief seminar module on dizziness16. The questions targeted high-yield board review topics. Crossword-generating software is readily available via many different websites; we used a free, downloadable program from the following website: http://www.eclipsecrossword.com.
Weeks 2–4: The content of the subtopic weeks varied depending on the module, but the same techniques were employed. The emphasis during these sessions was on the generation of a differential diagnosis based on a clinical reasoning algorithm. By using a problem-based approach (i.e. starting from the symptom of dizziness rather than the diagnosis of vertigo), the course created a real-life clinical scenario. Prior to the session, a practical clinical reasoning algorithm was created by the chief residents, with each session taking 2 to 3 hours to prepare. At the start of the session, the interns were separated into two chief resident-led groups. Each subtopic session (three per module) began with drawing the full concept map (Figure 2). The concept map started with the module title (e.g. dizziness), and a broad differential diagnosis was then determined using an evidence-based clinical algorithm. The algorithm used information gathered from multiple sources, including the history, physical exam, laboratory data, and imaging. For the dizziness module, the initial differential diagnosis was narrowed first by historical questions, then by physical exam findings, and lastly by asking targeted questions. Once the concept map was fully developed, a more detailed discussion of one diagnostic pathway (e.g. vertigo) commenced. A different diagnostic pathway was discussed in detail in each subtopic session. By the end of the module, the intern had an understanding of each diagnosis, as well as of how the final, targeted diagnosis was derived from the initial, broad differential. The concept map could be handwritten or created via an internet-based program (https://bubbl.us/).
Figure 2. Concept map for dizziness module.
This is an example of a completed concept map from the chief seminar module on dizziness16. The concept map starts at the top and progresses downward as more information is collected via intern participation and chief resident guidance. Many online concept mapping programs are available. For this seminar, we used the following website: https://bubbl.us/.
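The clinical reasoning structure described above lends itself to a simple tree representation. The sketch below is a minimal, hypothetical Python example (not part of the original curriculum materials) showing how a module's concept map could be encoded as a nested dictionary and printed as an indented outline; the branch labels are illustrative diagnostic categories for dizziness, not a reproduction of our teaching material.

```python
# Hypothetical sketch: a concept map as a nested dictionary. Each key is a
# diagnosis or diagnostic category; its children are the branches revealed as
# the history, physical exam, and targeted questions narrow the differential.
# All labels are illustrative placeholders.
concept_map = {
    "Dizziness": {
        "Vertigo (room-spinning sensation)": {
            "Peripheral (e.g. benign paroxysmal positional vertigo)": {},
            "Central (e.g. posterior circulation stroke)": {},
        },
        "Presyncope (near-faint)": {
            "Cardiac arrhythmia": {},
            "Orthostatic hypotension": {},
        },
        "Disequilibrium (unsteadiness with walking)": {},
        "Nonspecific lightheadedness": {},
    }
}


def print_map(node: dict, depth: int = 0) -> None:
    """Print the concept map as an indented outline, mirroring the top-down
    drawing built during the small-group sessions."""
    for label, children in node.items():
        print("  " * depth + "- " + label)
        print_map(children, depth + 1)


print_map(concept_map)
```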
Methods
To evaluate the efficacy of our didactic curriculum, we administered a five-question multiple-choice test before and after each module. Informed consent forms were distributed to all participants, and IRB approval for exemption was obtained from Lenox Hill Hospital, North Shore-LIJ (IRB#: 13-045A). Prior to beginning a module, each intern received a unique identifier, which was used to link that intern's pre- and post-test for the module. The pre-test was given prior to the Week 1 crossword competition, and the post-test was given after the Week 4 teaching session. On completion of a module, pre- and post-test data were entered into a secure, anonymous database according to each unique identifier. Using SPSS Version 20 (IBM SPSS, Chicago, IL), data for each module were analyzed via the Wilcoxon signed-rank test. The pre-test and post-test were identical in order to control for inter-test variability.
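Programs without access to SPSS could reproduce the same paired, non-parametric comparison with open-source tools. The following is a minimal sketch using Python's scipy.stats.wilcoxon on made-up placeholder scores (not our actual data) to show how one module's pre- and post-test scores would be compared.

```python
# Illustrative re-analysis sketch; the study itself used SPSS Version 20.
# The scores below are fabricated placeholders on the 0-5 test scale.
from scipy.stats import wilcoxon

pre_scores = [2, 3, 1, 4, 2, 3, 2, 3, 1, 2, 3, 2, 4, 2, 3, 2]
post_scores = [4, 4, 3, 5, 3, 4, 3, 4, 2, 4, 4, 3, 5, 3, 4, 4]

# Paired, non-parametric comparison of pre- vs. post-test scores for one module.
statistic, p_value = wilcoxon(pre_scores, post_scores)
print(f"Wilcoxon statistic = {statistic:.1f}, p = {p_value:.4f}")
```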
A six-item resident satisfaction survey was also distributed for each module (Figure 3). This survey was completely anonymous, and the data collected were descriptive in nature. Survey questions focused on resident satisfaction with the content and style of the module, as well as the perceived effectiveness of the crossword puzzle and small group sessions.
Figure 3. Evaluation form for dizziness module.
This is the satisfaction survey evaluation form we used for the dizziness module. We evaluated six parameters aimed at gaining an overall impression of intern satisfaction with each module.
Results
Preliminary efficacy results
Efficacy results were obtained for all eleven modules (Figure 4). Results showed a statistically significant increase from pre-test to post-test score for 9 of the 11 completed modules: chest pain, fever, abdominal pain, shock, syncope, jaundice, dizziness, anemia, and AKI. The first module, dyspnea, showed a trend towards statistical significance, as did the AMS module to a lesser degree.
Figure 4. Pre- and post-test scores by module.
This figure evaluates the change in score from pre-test to post-test for each of the chief seminar modules. The x-axis lists each module with the number of participants who took both the pre- and post-test. The y-axis has the average participant score (0 to 5). Each module was evaluated for a significant change between pre- and post-test and the p-value can be found at the top of each module’s bar chart.
Satisfaction survey results
Intern satisfaction results were obtained for all eleven modules (Figure 5). Survey results were aggregated by survey parameter for each module. Each response was given a numeric code: Strongly Agree (2 points), Agree (1 point), Neither (0 points), Disagree (-1 point), and Strongly Disagree (-2 points). The aggregate results were then weighted according to the numeric code for each response and averaged. Of the six survey parameters, five (content, style, chief resident effectiveness, improved ability to diagnose, and improved ability to treat) averaged at or above the “agree” response for all eleven modules. Only the survey question concerning the crossword puzzle had aggregate results below the “agree” level.
Figure 5. Resident satisfaction survey results by parameter and module.
This cluster chart displays the satisfaction survey results for each satisfaction parameter by module. The x-axis lists each survey question and the y-axis lists the level of resident satisfaction, with “2” representing strong satisfaction and “-2” representing strong dissatisfaction.
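To make the weighting scheme concrete, the sketch below shows how a single satisfaction parameter could be scored using the numeric codes described above. It is a hypothetical illustration: the response list and function name are ours for this example, not part of the study's actual analysis files.

```python
# Hypothetical scoring sketch for the Likert-scale satisfaction survey.
LIKERT_WEIGHTS = {
    "Strongly Agree": 2,
    "Agree": 1,
    "Neither": 0,
    "Disagree": -1,
    "Strongly Disagree": -2,
}


def mean_satisfaction(responses: list) -> float:
    """Average numeric code for one survey parameter in one module."""
    coded = [LIKERT_WEIGHTS[r] for r in responses]
    return sum(coded) / len(coded)


# Example: made-up responses to the "content" question for one module.
content_responses = ["Strongly Agree", "Agree", "Agree", "Strongly Agree", "Neither"]
print(mean_satisfaction(content_responses))  # 1.2, i.e. above the "Agree" level
```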
Discussion
The primary goal of our study was to present a reproducible, chief resident-led, teaching curriculum that applies active learning principles to a clinical reasoning seminar for interns. The interns who participated in our study were focused and engaged during our sessions and their positive satisfaction survey results reflect this. We have described our curriculum in detail with the hope that it can be replicated by other residency programs.
The second goal of our study was to present preliminary data evaluating the efficacy of this teaching modality. We have shown a statistically significant increase in test scores for 9 of our 11 modules; for these 9 modules, the interns showed retention of clinical reasoning techniques. Coupled with the positive satisfaction surveys, we conclude that this curriculum is both effective and desirable. The remaining two modules, dyspnea and AMS, trended towards, but did not achieve, statistical significance. Dyspnea was our first module, and we were not surprised by its lack of statistical significance. Less clear was why AMS did not reach statistical significance; the broad nature of this topic, combined with one less teaching session, likely contributed to the decreased efficacy outcome for this module. It is also important to note that the number of participants in each of these sessions was low, and with a larger sample these modules might have reached statistical significance.
Several limitations of our study have been identified. First, the lack of a comparison group does not allow us to conclude that our teaching curriculum was more effective than a traditional teaching approach. As this was a pilot study, the first step was to show that this was an effective teaching method; subsequent research can build on our preliminary findings by directly comparing this novel curriculum to a traditional, purely lecture-based curriculum. Another limitation was the potential for a practice effect, in which the answers to the pre-test are remembered for the post-test. This effect is most pronounced with a short interval between pre- and post-testing; we did not review any pre-test answers with the participants, and our tests were separated by one month, which was likely sufficient to limit this bias. Another possibility is that the interns searched for the answers to the pre-test before they completed the post-test. We find this unlikely, since the post-test scores never approached 100% accuracy. Lastly, our study did not examine long-term retention, which requires the post-test to be repeated at a longer interval and will be an area of future research.
Despite these relatively minor limitations, we have shown that this chief resident-led, evidence-based clinical reasoning approach to graduate medical education is effective and enjoyable for internal medicine residents, and we feel it should be applied and adapted throughout residency education.
Author contributions
C.D. and V.G.: responsible for the program conception, design, and execution; responsible for data collection and storage; primary authors of the manuscript, responsible for writing, editing, and figure creation.
G.P.: responsible for all statistical analysis and interpretation. Edited manuscript.
K.J.: responsible for the program conception and design. Responsible for writing and editing manuscript.
Competing interests
No competing interests were disclosed.
Grant information
The author(s) declared that no grants were involved in supporting this work.
Acknowledgments
The authors would like to thank the 2012–2013 intern class at our hospital for participating in this teaching curriculum, as well as all of the residents who participated in our crossword competitions. Additionally, we would like to thank our chairman, Dr. Jack Ansell, for giving us the opportunity to implement a creative approach to medical education.
We presented an earlier version of this manuscript as a poster at the Association of Program Directors in Internal Medicine (APDIM) annual meeting in Orlando, Florida, in 2013, and as an oral presentation at the Northeastern Group on Educational Affairs (NEGEA) annual retreat in New York City, in 2013.
Small monetary rewards for the participants came from the internal medicine department at our hospital.
References
1. Lee E, Lazarus ME, El-Farra N: An updated focus on internal medicine resident education. Am J Med. 2012; 125(11): 1140–3.
2. Druss BG, Marcus SC: Growth and decentralization of the medical literature: implications for evidence-based medicine. J Med Libr Assoc. 2005; 93(4): 499–501.
3. Hmelo-Silver CE: Problem-based learning: What and how do students learn? Educ Psychol Rev. 2004; 16(3): 235–66.
4. Kinkade S: A snapshot of the status of problem-based learning in U.S. medical schools, 2003–04. Acad Med. 2005; 80(3): 300–1.
5. Neville AJ: Problem-based learning and medical education forty years on. A review of its effects on knowledge and clinical performance. Med Princ Pract. 2009; 18(1): 1–9.
6. Koh GC, Khoo HE, Wong ML, et al.: The effects of problem-based learning during medical school on physician competency: A systematic review. CMAJ. 2008; 178(1): 34–41.
7. Weberschock TB, Ginn TC, Reinhold J, et al.: Change in knowledge and skills of Year 3 undergraduates in evidence-based medicine seminars. Med Educ. 2005; 39(7): 665–71.
9. Cook DA: Modern learning principles. In: Williams FK, Colbert C, Costa ST, et al., editors. A Textbook for Today’s Chief Medical Resident. 20th ed. Alexandria, VA: Association of Program Directors in Internal Medicine; 2012. p. 73–84.
12. Fitzgibbons JP, Bordley DR, Berkowitz LR, et al.; Association of Program Directors in Internal Medicine: Redesigning residency education in internal medicine: a position paper from the Association of Program Directors in Internal Medicine. Ann Intern Med. 2006; 144(12): 920–6.
13. Prober CG, Heath C: Lecture halls without lectures – a proposal for medical education. N Engl J Med. 2012; 366(18): 1657–9.
14. Calderon KR, Vij RS, Mattana J, et al.: Innovative teaching tools in nephrology. Kidney Int. 2011; 79(8): 797–9.
15. Rondon H, Jhaveri KD: Nephrology crossword: Divalents, a journey through the nephron. Kidney Int. 2012; 82(4): 500–1.
16. Post RE, Dickerson LM: Dizziness: a diagnostic approach. Am Fam Physician. 2010; 82(4): 361–8, 369.
17. Dittus C, Grover V, Panagopoulos G, et al.: Chief seminar datasets for pre-/post-testing and satisfaction surveys. figshare. 2014.
Dittus C, Grover V, Panagopoulos G and Jhaveri K. Chief’s seminar: turning interns into clinicians [version 1; peer review: 2 approved]. F1000Research 2014, 3:213 (https://doi.org/10.12688/f1000research.5221.1)
Open Peer Review
Reviewer Report 1
Internship is a time of rapid professional and personal growth. Instruction that meets the practical needs of the learners and is delivered in ways concordant with the practice environment is more likely to be viewed as successful. This manuscript describes a year-long program (11 modules of 4 weeks each) developed for interns to address common clinical situations and diagnoses.
Overall, this was a well thought out and delivered program. The authors sought ways to engage the learners including basing the instruction on active learning techniques.
Title: Turning interns into clinicians is the goal, but the program only brings the interns part of the way to that goal. The title may overstate the scope of the program; however, the authors note this was a pilot program.
Abstract: Provides an adequate summary.
Study design, methods and analysis are appropriate. The goals of the study as stated were to develop a reproducible, chief resident-led program that applied active learning principles to a clinical reasoning seminar (assessed by survey). A secondary goal was to present preliminary data to evaluate efficacy (assessed by pre-/post-test scores).
Discussion and conclusions. The authors appropriately discuss the findings, limitations and conclusions; the curriculum was well received and post test scores improved for most of the modules.
Replication: the authors have supplied their surveys, and example of the crossword, the topics and the format within the residency program to allow for replication of the process of the program.
Comment: to follow this study and further assess the process of turning interns into clinicians, it would be of interest to assess their clinical performance. One approach could be a review of their patients (inpatient and outpatient) for diagnoses addressed and patient outcomes. Another approach is development of an Entrustable Professional Activity around these key areas, and assessment of housestaff performance in the relevant venue (e.g. shock in the ICU or jaundice in the hospital). I liked the use of the concept map that the learners developed for each module: one way to potentially assess that is to select cases for case conference or morning report and have the interns use their concept map in the discussion of that case.
Competing Interests: No competing interests were disclosed.
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.
Reviewer Report 2
This is a well-written, informative and easily understood report on an observational study of a pilot educational program for teaching first-year interns, led by chief residents. It shows a number of positive results. It would be interesting to also get feedback from the chief residents for each module, and information on ‘what they would change’ (if anything), particularly as they develop the materials according to a framework. Some of the limitations of the pilot are clearly identified, which is very useful.
Background, line 11: a word is missing – should read ‘..knowledge base of an attending clinician, while…’
Competing Interests: No competing interests were disclosed.
I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.