
A Delphi consensus study for teaching “Basic Trauma Management” to third-year medical students


Abstract

Background

The Basic-Trauma Management (BTM) course has been taught to third-year medical students in small groups for many years without substantial changes. With the introduction of a new curriculum for Swiss medical students, it was necessary to revise the BTM content and re-align it. Our aim was to identify core competencies for the revised BTM course.

Methods

We applied a three-round, step-wise Delphi consensus process. First, we asked open-ended questions on the most important competencies to be taught for BTM; the second round used Likert scales to establish agreement on the competencies; and the final round sought consensus on these BTM competencies. Stakeholders were selected based on their long-standing experience in teaching BTM and in managing trauma patients.

Results

Consensus was found on 29 competencies out of an initial 130 proposals. “Human Factors”, which had not been taught previously, scored relatively high, at 22%. The sole specific trauma skill agreed upon was the use of tourniquets.

Conclusions

This is an example of curricular revision of a clinical skills course after the introduction of a regulatory framework for undergraduate medical education. The revised course curriculum tailors trauma concepts and skills to fulfill stakeholder needs, in agreement with the new Swiss learning outcomes.

Introduction

Competency-based medical education (CBME) is receiving increasing attention worldwide due to societal concerns about the current role of physicians [1,2,3,4,5]. The overarching goal of CBME is to better train and prepare medical students for their medical practice, and to improve patient care [5]. A competency framework has been proposed and guidelines have been developed for undergraduate competency-based medical education [6, 7]. All Swiss medical schools are required to base their undergraduate curricula on a well-defined set of competencies, the "Principal Relevant Objectives and Framework for Integrated Learning and Education in Switzerland" (PROFILES) [8]. PROFILES is influenced by the CanMEDS 2015 Framework [1] and the Dutch Framework for Undergraduate Medical Education [9], both well-known frameworks for CBME.

Over the last decade, competency-based curricula have been introduced worldwide in undergraduate medical education. This represents a “shift from the traditional focus on teaching and instruction”, which is also called teacher-centered teaching, to a “learning paradigm that enables students to construct knowledge for themselves”; i.e., student-centered learning [10].

Such changes in educational thinking provide an opportunity to reconsider approaches to undergraduate medical education, although at the same time they can present difficulties as they move beyond routine curricular renewal [11]. These include discussions about the learning of competencies beyond the “Medical Expert” domain. Which outcomes are expected at different stages of student development? Which teaching strategy or method might best achieve the proposed learning outcomes? [3, 11,12,13] Additionally, educating medical students in complex subspecialties can be challenging, and the optimal timing and content remain unknown [14].

Research is scarce with regard to the process of developing a competency-based undergraduate subspecialty course based on a given framework [15]. Little is known on how the curriculum revision works: who was involved or what pathways were followed to ensure alignment? Although the concept of CBME is not new, its application is still unfamiliar to many medical university faculty members.

The mandatory Basic Trauma Management (BTM) course at the Medical Faculty of the University of Bern in Switzerland is such an example. The course has been taught for over 15 years to third-year students, exclusively in a face-to-face, 4-h, small-group format. It consists of an introductory lesson on trauma management (1 h), after which students split into small groups to discuss clinical cases and practice skills (3 h). As preparation, students receive a "BTM Course-Manual" with facts and skills descriptions for trauma care before the start of the course.

The BTM course was aligned with the Swiss Catalogue of Learning Objectives for Undergraduate Medical Training (SCLO) [16], first issued in 2008. The SCLO focused on knowledge and did not facilitate the acquisition of core practical skills. In 2017, the newly published Swiss framework, PROFILES [8], defined entrustable professional activities (EPAs) that medical students should be able to perform at the end of their studies. This mandated a revision of the BTM course content. Additionally, unwarranted practice variations had been noted while teaching the BTM course, because tutors devoted different amounts of time to lecturing, which left less time for the intended skills training. Finally, these teaching activities at the University of Bern had not been assessed, and concerns had been raised regarding student motivation.

Curriculum revision classically starts with a needs assessment, defined as “the systematic set of procedures undertaken for making decisions about program improvements” [17], which aims to collect data and to narrow the gap between current and recommended practice. On the one hand, the introduction of the Swiss PROFILES represents a legislative need. On the other hand, clinical teachers of BTM expressed needs regarding motivation and the updating of content. Finally, there are normative needs, to diminish unwarranted teaching practice variations. However, the student needs for trauma are unknown. All in all, the educational needs of several stakeholders had to be addressed, and the corresponding competencies appropriate to third-year medical students needed to be explored.

The aim of this study was to find out which competencies are to be taught for the BTM course, and to use the Delphi method to develop a core curriculum for the BTM course at the University of Bern. Our study might be considered as a generalizable example of how to adapt a medical undergraduate curriculum driven by a new regulatory framework.

Methods

Study design and setting

We used a three-round modified Delphi technique, with the aim of establishing the expected competencies of third-year medical students participating in the BTM course. The Delphi technique facilitates curriculum revision, as investigators can work at a distance with a variety of target group representatives [18, 19], and it allows the opinions of a broad range of experts to be consolidated into a manageable number of precise statements. The technique assumes that "pooled intelligence" captures the collective opinion of stakeholders [20]. Briefly, stakeholders answer several rounds of questionnaires, after which an external facilitator provides a summary of the responses. In this way, stakeholders can revise their former answers in light of the replies of others, with the chance that the group will converge towards a "consensus" [21].

Hypothesis and research question

Our a-priori hypothesis was that different stakeholders would have different perspectives on the importance of different topics in BTM. Following this concept, our research question asked “Which trauma topics should be addressed in a basic trauma management course for third-year medical students?”

Data collection and management

We followed the Garavalia method for the Delphi technique [22]. In the first round, we asked open-ended questions with the aim of prioritizing the most important teaching topics for the BTM course regarding knowledge, skills, and attitudes: "What should be the priorities of the course?" Participating stakeholders were asked to list up to nine "most relevant" items on knowledge (3), skills (3), and attitudes (3) that the third-year medical students should learn in the BTM course. All of the participating stakeholders were invited by e-mail to answer the questionnaire. An e-mail reminder was sent 10 days after the initial invitation.

To ensure a high-quality survey instrument, all of the rounds of questionnaires were developed iteratively by consultation and feedback. The online version was pilot tested with two German-speaking stakeholders (SN, RG) to confirm the comprehensibility of the questionnaire and the usefulness of the response options.

After completion of the first round, the facilitator (JBE) read all of the answers to the open questions, edited, and merged similar answers, and grouped them into categories, to compile the second-round questionnaire.

The round 2 response format was a five-point Likert scale: 1, strongly disagree; 2, moderately disagree; 3, neither agree nor disagree; 4, moderately agree; and 5, strongly agree. This online questionnaire was pilot-tested for ease of completion and technical functioning. At the end of round 2, a final item list was developed. All of the stakeholders who participated in round 1 were invited by e-mail to the second online survey to rate each statement.

After the second round, we calculated the median and interquartile range (IQR) for each statement, as well as the percentage of agreement, by adding together the 4 (Agree) and 5 (Strongly Agree) responses and calculating their proportion of the total answers for each question. The predefined cut-off for consensus was 75% agreement and a median score of 5. Items that reached consensus were added to the final list. Items with disagreement (median ≤ 3) were excluded.
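The round 2 decision rule can be sketched in a few lines of code. This is an illustrative sketch only: the function name and the example ratings are hypothetical, and only the thresholds (consensus at ≥ 75% agreement with a median of 5; exclusion at median ≤ 3) come from the study.

```python
import statistics

def classify_statement(ratings):
    """Classify one Delphi statement from its 5-point Likert ratings.

    Consensus: >= 75% of ratings are 4 (Agree) or 5 (Strongly Agree)
    AND the median rating is 5. Exclusion: median <= 3. Everything
    else is carried forward to round 3 for re-rating.
    """
    median = statistics.median(ratings)
    # Agreement = proportion of 4s and 5s among all responses
    agreement = sum(r >= 4 for r in ratings) / len(ratings)
    if agreement >= 0.75 and median == 5:
        return "consensus"
    if median <= 3:
        return "excluded"
    return "re-rate in round 3"

# Hypothetical ratings from 18 stakeholders for one statement
print(classify_statement([5, 5, 5, 5, 5, 5, 4, 4, 4,
                          5, 5, 5, 5, 4, 5, 5, 3, 5]))  # → consensus
```

Note that both conditions must hold for consensus: a statement with 80% agreement but a median of 4 would still be re-rated in round 3.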

In round 3, the stakeholders were given the median ratings from round 2 and the levels of agreement for each statement. All items without consensus had to be re-rated. Items included in the questionnaire were again re-piloted, and final edits were made based on the feedback received. In round 3, participants could only answer “yes” or “no”, to decide whether or not the remaining items should be included in the final competence list. Competencies with 75% or more of stakeholder agreement were selected. The pre-final list was sent again to all of the stakeholders to be commented on and signed (Fig. 1).

Fig. 1 Delphi method flowchart

Study participants

Stakeholders included a selection of (i) BTM teachers; (ii) certified in-hospital emergency physicians; (iii) final-year medical students who had participated in the BTM course; (iv) out-of-hospital emergency physicians; (v) curriculum designers; and (vi) external educational experts. Selection was based on the long-standing experience of the participants in BTM teaching and their management of trauma patients. We aimed to include 15 stakeholders as participants [20], two to three in each group. A set of 36 invitations to stakeholders was sent out to obtain sufficient participants.

Gap analysis

We performed a gap analysis to compare our survey findings with pre-selected trauma-related objectives and EPAs from the PROFILES [8] report. JBE and RG selected eight general objectives (GO 1.5, GO 1.6, GO 1.11, GO 2.2, GO 3.1, GO 3.2, GO 7.1, GO 7.2) and twenty-one EPAs (EPA 1.1, EPA 1.3, EPA 1.5, EPA 2.1, EPA 2.2, EPA 2.3, EPA 2.4, EPA 2.6, EPA 2.7, EPA 3.2, EPA 5.1, EPA 5.2, EPA 5.3, EPA 5.4, EPA 5.5, EPA 6.1, EPA 6.2, EPA 6.3, EPA 6.5, EPA 6.8, EPA 9.2) that might be covered in the BTM course.

External review

We followed Penciner's advice [29] to externally review our results upon completion of the data handling. Three trauma management experts were selected as external reviewers to provide brief comments on the validity and usefulness of our methodology and results. Together with the study investigators, these external reviewers compared the list from round 3 with the new Swiss PROFILES, to assure consistency between the Swiss EPAs for undergraduate medical students and the new list of BTM competencies.

Data handling

A descriptive analysis of each questionnaire result was conducted. Data from the consecutive rounds were stored, in accordance with the requirements of the Swiss Human Research Act, on the Departmental research server LabKey (LabKey Software, Seattle, USA), which was accessible only to the investigators through personalized passwords. We followed the Guidance on Conducting and Reporting Delphi Studies (CREDES) [23].

Results

The data were collected between 1 October 2018 and 28 February 2019. Round 1 took 30 days and enrolled 18 participants out of the 36 invited (response rate, 50%). The group description and participation rate are given in Table 1. Round 2 took 10 days, and round 3 took 14 days. There were no drop-outs after enrollment.

Table 1 Participation rate during the first round of the Delphi method

First round results

The participants listed 47 priorities, 28 knowledge items, 30 skills, and 25 attitudes. These were organized into a framework of nine BTM domains to compile the questionnaire for round 2, which ended up with 85 items.

These items were coded according to the following competencies: “triage” (7.4% of answers); “structured approach to trauma” (9.4%); “general trauma management” (15.3%); “technical skills” (23.5%); “particular trauma management” (15.3%); “transport” (3.5%); “human factors” (22.4%); “security issues” (4.7%); and “knowledge” (1.2% of all answers).

Second round results

The second round response rate was 100%. The median and percentage of agreement for each item are shown in Table 2. Items with > 75% agreement and a median of 5 were accepted as consensual and did not enter round 3. Items with a median ≤ 3, as well as items overlapping with subjects of other third-year courses, were excluded.

Table 2 Competencies from Delphi round 2

The overall agreement in round 2 was 87%. The overall agreement in “triage” was low (52%). No consensus was reached in 25% of the items, which resulted in their exclusion. “Structured approach” had high overall agreement (91%) and a consensus of 75%. “General management” had a high overall agreement of 75%, with 23% of the items excluded and 46% consensus. “Technical skills” had moderate overall agreement (73%), with 10% item exclusion and 15% acceptance. “Specific management” showed low overall agreement (51%), with an item exclusion rate of 53%, and no consensus. “Transport” had low overall agreement (63%), with no item exclusion or agreement. “Human Factors” had high overall agreement (75%), with 11% of items excluded and 16% consensus. “Security” had 86% overall agreement, and all of the items reached consensus. “Knowledge” excluded only one item, due to misunderstandings in the phrasing. Of the 85 items in round 2, 44 did not reach consensus and were taken up in round 3.

Third-round results

All eighteen stakeholders assessed the 44 items for inclusion in the final curriculum (100% response rate). All of the competencies with an agreement of 75% or more were selected for the final listing. This round reached consensus for 20 items (45%), with an overall agreement of 76%. “Triage”, “Structured Approach”, and “Transport” did not reach agreement. Higher agreement was reached for “General Management” (75%), “Human Factors” (64%), and “Technical Skills” (57%). The statement that students should not be taught advanced airway management during the BTM course had 72% agreement, although students should be able to perform bag-mask ventilation correctly when deemed necessary (66.7% agreement). Table 3 shows the list of all of the included items. After merging the redundant items, we ended up with a list of 29 items to be included in the BTM course (Table 4).

Table 3 Items included by consensus from rounds 2 and 3 of the Delphi method and the subsequent merging and editing of competencies
Table 4 Final items to include in the BTM course for third-year medical students of the University of Bern

External reviewing

External reviewers made comments on different aspects of the project: validity, applicability, usefulness of results, and adequacy of methodology for curriculum development. All of the experts mentioned the adequacy of panel selection for enhancing face validity. Reviewers also pointed out that the methodology involved was adequate to inform on competencies and curriculum development. However, our results were only considered applicable to the local standard of practice, because of the low response rate of the external sources.

The reviewers commented that our findings were useful because the mapping against the PROFILES report included a high percentage of items. Comparing the final list against the EPAs in PROFILES revealed agreement in 82% of all items of the new BTM course.

Discussion

This study determined which core competencies are necessary to teach to third-year medical students in BTM based on a stakeholder needs assessment and the requirements of the new Swiss CBME curriculum PROFILES. Our three-round Delphi process involved all course stakeholders and included external reviewers for validation. Twenty-nine competencies were selected out of an initial 130, for the new teaching program for BTM in Bern.

In line with Greenhalgh (2014), we needed an alternative view of evidence-based medicine, one that emphasizes the value of expert judgement not directly accessible through clinical trials. Our Delphi process allowed all of the rounds to be performed electronically, which cut costs, time, and resources [24]. Additionally, opinions could be expressed anonymously, avoiding peer pressure and promoting new perspectives on the subject.

Our approach was especially helpful because the stakeholders came from different backgrounds and departments; face-to-face discussions would have been hard to organize and impractical. All in all, the Delphi method was a quick way to achieve solid results. The most important competencies surfaced first and remained after several rounds of reflection, while less important or less clearly formulated competencies were systematically excluded. These advantages might explain the extensive use of the Delphi technique in medical education curriculum development [25,26,27,28,29].

Limitations

We faced the usual limitations of the Delphi technique [20]. Participant commitment was substantial, as participants needed to complete all three rounds. Our open questions might have discouraged stakeholders from answering, and long questionnaires can decrease overall motivation to participate. All this might account for the 50% non-response rate to the first round of questionnaires. Additionally, there is no clear definition in the literature of what makes an “expert”. Nonetheless, our stakeholders were representatives of the groups that are directly related to BTM education at Bern University, and by agreeing to participate, they showed a significant level of interest in the topic. Our panel consisted of 18 members, a number considered adequate for a Delphi method [20, 24]. The high response rates after enrollment also increased the validity of the results. Furthermore, the final list of competencies was validated by external reviewers with expertise in trauma medicine, which strengthens our findings.

The Delphi method is considered an effective tool to find “consensus”, although the level at which this “consensus” occurs is difficult to determine. The reported levels of consensus range from 51 to 80% [30], with a trend to higher percentages of agreement [23]. We set, a priori, a median score of 5 and > 75% agreement to accept a statement as consensual. Obviously, the sole determination of a consensus threshold does not mean the “correct” solution has been found [27]. Additionally, the Delphi technique tends to eliminate extreme positions and to force a conservative status [20].

Another limitation is reliability [31]. There is no evidence available to indicate whether two different panels given the same initial information will produce the same results. Therefore, generalizability might be limited by unique stakeholder characteristics, and solutions reached by such Delphi processes are simply a consensus opinion of this group.

The strengths of our study include the following: our approach was simple and effective in developing a curricular adaptation. In this sense, it might be applicable to other curriculum development, as it allows priorities for a mandatory clinical skills course for undergraduate medical students to be summarized in a short time and with limited resources. The involvement of all stakeholders and the fast turn-around of the three Delphi rounds assured integration of the current needs of teachers, students, and in- and out-of-hospital emergency physicians. The alignment with the legally mandated new national curriculum for the study of medicine addressed these needs and fulfilled the responsibility for curriculum realignment.

The PROFILES report was, in our case, inadequate on its own to guide the education of third-year, trauma-naïve medical students, because PROFILES lists all of the competencies medical students need to have at the end of their training. Adapting this framework for the third-year course was challenging and difficult to “fine tune”. However, the Delphi technique was particularly useful for adapting BTM knowledge, skills, and competencies to third-year medical students. Such an adaptation of competencies to a specific student level was evident in round 2 of this study, where the category “Specific Management” had low overall agreement on a variety of skills (50.7%), with 53% item exclusion and no consensus. In round 3, only the management of a tourniquet found agreement. We could thereby adapt the given trauma competencies to the third-year level. Our results determine which BTM principles third-year students should be exposed to. This has been done before in emergency medicine curricula [32,33,34,35], but our study uses the Delphi technique for the first time in a BTM curriculum.

We were surprised by the strong vote on the human factor competencies, which had not been addressed before in the BTM course. Our findings represent the expressed need to introduce teaching of non-technical skills beyond the “medical expert” competence. Human factors include a set of social and cognitive abilities that encompass situational awareness, risk assessment, clinical decision making, leadership, communication skills, and teamwork [36]. The influence of these human factors on clinical outcomes has already been ascertained [37, 38]. In the undergraduate setting, however, there is a substantial lack of guidance and teaching for these skills [39, 40]. Our stakeholders underlined the need to teach human factors, which might represent a trend that is already occurring in the postgraduate medical education milieu, as a shift towards a more holistic model of medical education.

Conclusion

The revised BTM course curriculum proposed in this study is an attempt to tailor concepts and skills to fulfill unmet needs. It is an example of curricular adaptation driven by a new regulatory framework, to reform learning outcomes. In an effort to achieve this, a three-step Delphi process that involved all stakeholders of the course finally listed 29 core competencies to be taught to third-year medical students in the BTM course.

Practice points

  • Example of a curricular adaptation based on stakeholders’ needs assessment driven by a new regulatory framework.

Availability of data and materials

The datasets analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

BTM:

Basic Trauma Management

CBME:

Competency-based medical education

EPA:

Entrustable professional activity

IQR:

Interquartile range

PROFILES:

Principal Relevant Objectives and Framework for Integrated Learning and Education in Switzerland

SCLO:

Swiss Catalogue of Learning Objectives for Undergraduate Medical Training

References

  1. Frank JR, Snell L, Sherbino J. CanMEDS 2015 Physician Competency Framework. Ottawa: Royal College of Physicians and Surgeons of Canada; 2015.

  2. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29(7):642–7.

  3. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, Swing SR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45.

  4. Harden RM. Outcome-based education: the future is today. Med Teach. 2007;29(7):625–9.

  5. Harden RM, Crosby JR, Davis MH, Friedman M. AMEE Guide No. 14: Outcome-based education: part 5 - from competency to meta-competency: a model for the specification of learning outcomes. Med Teach. 1999;21(6):546–52.

  6. Learning objectives for medical student education--guidelines for medical schools: report I of the Medical School Objectives Project. Acad Med. 1999;74(1):13–8. https://doi.org/10.1097/00001888-199901000-00010.

  7. Albanese MA, Mejicano G, Mullan P, Kokotailo P, Gruppen L. Defining characteristics of educational competencies. Med Educ. 2008;42(3):248–55.

  8. Michaud P, Jucker-Kupper P. PROFILES; principal objectives and framework for integrated learning and education in Switzerland. Bern: Joint Commission of the Swiss Medical Schools; 2017.

  9. van Herwaarden CLA, Laan RFJM, Leunissen R. The 2009 Framework for Undergraduate Medical Education in the Netherlands. Utrecht: Dutch Federation of University Medical Centres; 2009.

  10. Maher A. Learning outcomes in higher education: implications for curriculum design and student learning. J Hosp Leis Sport Tour Educ. 2004;3(2):46–54.

  11. Harris P, Snell L, Talbot M, Harden RM. Competency-based medical education: implications for undergraduate programs. Med Teach. 2010;32(8):646–50.

  12. Albanese MA, Mejicano G, Anderson WM, Gruppen L. Building a competency-based curriculum: the agony and the ecstasy. Adv Health Sci Educ Theory Pract. 2010;15(3):439–54.

  13. Hawkins RE, Welcher CM, Holmboe ES, Kirk LM, Norcini JJ, Simons KB, et al. Implementation of competency-based medical education: are we addressing the concerns and challenges? Med Educ. 2015;49(11):1086–102.

  14. Wisniewski WR, Fournier KF, Ling YK, Slack RS, Babiera G, Grubbs EG, et al. A focused curriculum in surgical oncology for the third-year medical students. J Surg Res. 2013;185(2):555–60.

  15. Cunningham J, Key E, Capron R. An evaluation of competency-based education programs: a study of the development process of competency-based programs. J Competency-Based Educ. 2016;1(3):130–9.

  16. Working Group under a Mandate of the Joint Commission of the Swiss Medical Schools (SMIFK/CIMS). Swiss Catalogue of Learning Objectives for Undergraduate Medical Training. Bern; 2008. Retrieved from http://www.smifk.ch.

  17. Leavy P. The Oxford Handbook of Qualitative Research. Oxford: Oxford University Press; 2014.

  18. Atkinson NL, Gold RS. Online research to guide knowledge management planning. Health Educ Res. 2001;16(6):747–63.

  19. De Vet E, Brug J, De Nooijer J, Dijkstra A, De Vries NK. Determinants of forward stage transitions: a Delphi study. Health Educ Res. 2005;20(2):195–205.

  20. de Villiers MR, de Villiers PJ, Kent AP. The Delphi technique in health sciences education research. Med Teach. 2005;27(7):639–43.

  21. Rowe G, Wright G. The Delphi technique as a forecasting tool: issues and analysis. Int J Forecast. 1999;15(4):353–75.

  22. Garavalia L, Gredler M. Teaching evaluation through modeling: using the Delphi technique to assess problems in academic programs. Am J Eval. 2004;25(3):375–80.

  23. Junger S, Payne SA, Brine J, Radbruch L, Brearley SG. Guidance on conducting and reporting Delphi studies (CREDES) in palliative care: recommendations based on a methodological systematic review. Palliat Med. 2017;31(8):684–706.

  24. Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health. 1984;74(9):979–83.

  25. Stritter FT, Tresolini CP, Reeb KG. The Delphi technique in curriculum development. Teach Learn Med. 1994;6(2):136–41.

  26. Edgren G. Developing a competence-based core curriculum in biomedical laboratory science: a Delphi study. Med Teach. 2006;28(5):409–17.

  27. Rohan D, Ahern S, Walsh K. Defining an anaesthetic curriculum for medical undergraduates. A Delphi study. Med Teach. 2009;31(1):e1–5.

  28. Rana J, Sullivan A, Brett M, Weinstein AR, Atkins KM, Sa TDWG. Defining curricular priorities for student-as-teacher programs: a National Delphi Study. Med Teach. 2018;40(3):259–66.

  29. Penciner R, Langhan T, Lee R, McEwen J, Woods RA, Bandiera G. Using a Delphi process to establish consensus on emergency medicine clerkship competencies. Med Teach. 2011;33(6):e333–9.

  30. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000;32(4):1008–15.

  31. Tomasik T. Reliability and validity of the Delphi method in guideline development for family physicians. Qual Prim Care. 2010;18(5):317–26.

  32. Tews MC, Wyte CM, Coltman M, Grekin PA, Hiller K, Oyama LC, et al. Developing a third-year emergency medicine medical student curriculum: a syllabus of content. Acad Emerg Med. 2011;18(Suppl 2):S36–40.

  33. Tews MC, Ditz Wyte CM, Coltman M, Hiller K, Jung J, Oyama LC, et al. Implementing a third-year emergency medicine medical student curriculum. J Emerg Med. 2015;48(6):732–43.e8.

  34. Wald DA, Lin M, Manthey DE, Rogers RL, Zun LS, Christopher T. Emergency medicine in the medical school curriculum. Acad Emerg Med. 2010;17(Suppl 2):S26–30.

  35. Manthey DE, Ander DS, Gordon DC, Morrissey T, Sherman SC, Smith MD, et al. Emergency medicine clerkship curriculum: an update and revision. Acad Emerg Med. 2010;17(6):638–43.

  36. Gordon M, Baker P, Catchpole K, Darbyshire D, Schocken D. Devising a consensus definition and framework for non-technical skills in healthcare to support educational design: a modified Delphi study. Med Teach. 2015;37(6):572–7.

  37. Stewart MA. Effective physician-patient communication and health outcomes: a review. Can Med Assoc J. 1995;152(9):1423–33.

  38. Deveugele M, Derese A, De Maesschalck S, Willems S, Van Driel M, De Maeseneer J. Teaching communication skills to medical students, a challenge in the curriculum? Patient Educ Couns. 2005;58(3):265–70.

  39. Nicolaides M, Cardillo L, Theodoulou I, Hanrahan J, Tsoulfas G, Athanasiou T, et al. Developing a novel framework for non-technical skills learning strategies for undergraduates: a systematic review. Ann Med Surg (Lond). 2018;36:29–40.

  40. Gordon M, Farnan J, Grafton-Clarke C, Ahmed R, Gurbutt D, McLachlan J, et al. Non-technical skills assessments in undergraduate medical education: a focused BEME systematic review: BEME Guide No. 54. Med Teach. 2019;41(7):732–45. https://doi.org/10.1080/0142159X.2018.1562166.


Acknowledgments

We would like to thank Dr. Sonja Lang, Prof. Dr. Lorenz Theiler, and Dr. Phillip Venetz for their outstanding external review of the validity and usefulness of our methodology and results.

Funding

Financed by a Departmental research grant of the Department of Anaesthesiology and Pain Medicine, Bern University Hospital, Bern, Switzerland.

Author information

JBE designed the study, acquired the data, and wrote the manuscript; SN reviewed the questionnaires for face validity; RG critically reviewed the manuscript. All authors read and approved the final manuscript.

Authors’ information

JBE: Dr. med., Consultant in Anaesthesiology and Intensive Care, PG Dip Med Ed (Dundee).

SN: Dr. med., Resident in Anaesthesiology, MStudies with a specialization in educational studies (Newcastle, AU), AFAMEE.

RG: Prof. Dr. med., Professor in Anaesthesiology and Intensive Care, MMEd (Bern), FERC.

Correspondence to Joana Berger-Estilita.

Ethics declarations

Ethics approval and consent to participate

This study followed the Helsinki Ethical Principles for Medical Research Involving Human Subjects and complied with the Swiss Human Research Act. The Cantonal Ethics Committee of Bern (KEK Bern) waived the need for ethics approval according to the Swiss Human Research Act (BASEC-number: Req-2018-00715). Written informed consent was required from all participants. All of the questionnaires were accompanied by a cover letter that explained the purpose of each round.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Presentations: Partial results of this study were presented as a conference oral presentation entitled “A Delphi consensus study to identify current most valuable knowledge, skills and attitudes for teaching Basic Trauma Management to third year medical students at the University of Bern” at the 2019 Swiss Faculty Development Network Conference on How Research on Learning Contributes to University Teaching Practice, held in Zurich on 22 February 2019.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Cite this article

Berger-Estilita, J., Nabecker, S. & Greif, R. A Delphi consensus study for teaching “Basic Trauma Management” to third-year medical students. Scand J Trauma Resusc Emerg Med 27, 91 (2019) doi:10.1186/s13049-019-0675-6


Keywords

  • Trauma
  • Teaching
  • Skills
  • Undergraduate
  • Curriculum
  • Delphi