ORIGINAL ARTICLE

Year: 2018 | Volume: 1 | Issue: 1 | Page: 28-30
How common are experimental designs in medical education? Findings from a bibliometric analysis of recent dissertations and theses
Kenneth D Royal1, Jason C.B. Rinaldo2
1 Department of Clinical Sciences, North Carolina State University, Raleigh, NC, USA
2 Office of Assessment, Rawls College of Business, Texas Tech University, Lubbock, TX, USA
Date of Web Publication: 1-Oct-2018
Correspondence Address: Dr. Kenneth D Royal, Department of Clinical Sciences, North Carolina State University, 1060 William Moore Dr., Raleigh, NC 27607, USA
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/EHP.EHP_5_18
Background: There has been a recent influx of researchers into the field of medical education from medical and health science backgrounds. Researchers from health fields are often unaware that studies involving experimental designs are relatively rare throughout educational research. Experts in education research note that experimental designs largely are incompatible with educational studies due to various contextual, legal, and ethical issues. Purpose: We sought to investigate the frequency with which experimental designs have been utilized in recent medical education dissertations and theses. Methods: We conducted a bibliometric analysis of dissertations and theses completed in the field of medical education between 2011 and 2016. Results: Fewer than 10% of doctoral dissertations and master's theses involved some type of experimental design. Only 6.12% of all dissertation and master's projects involved randomized experiments. Conclusions: Randomized experiments occur only slightly more frequently in medical education than in other educational fields.
Keywords: Bibliometrics, dissertations and theses, education research, experiments, medical education, research design, research quality
How to cite this article: Royal KD, Rinaldo JC. How common are experimental designs in medical education? Findings from a bibliometric analysis of recent dissertations and theses. Educ Health Prof 2018;1:28-30.
Introduction
In recent years, there has been a substantial influx of individuals conducting research in the field of medical education. The vast majority of these researchers come from medical and health science backgrounds and have little to no formal training in education research. These researchers tend to be accustomed to experimental designs, as such designs are frequently used in drug trials and health-related research and are widely considered the most rigorous, least biased, and of the highest quality.[1] Naturally, many of these individuals have called for an increased use of experimental designs and randomized controlled trials in medical education studies. Unfortunately, educational studies often do not allow for the same randomization that is possible in drug or treatment trials.
Concomitantly, numerous medical education researchers have questioned the quality of medical education research due to what they perceive to be a lack of rigorous research designs. Many of these individuals immediately dismiss any work that does not involve an experimental design. These hegemonic views about research designs typically are perceived as insulting by researchers with formal training in the field of education. Such attacks on nonexperimental designs are reminiscent of the “Paradigm Wars,” in which researchers in the social and behavioral sciences debated the superiority of quantitative versus qualitative methods for decades before eventually agreeing that each type of inquiry has its own merits and appropriate uses.
Unquestionably, there is a significant difference between research in the medical and health sciences and research in the education sciences.[2] For instance, what is entirely appropriate in medicine may be entirely inappropriate, unethical, impractical, and/or illegal in education (and vice versa).[2] Further, the nature of the variables studied, and the methods used to handle the challenges they pose, often are unique to one's discipline. In fact, experimental designs largely are incompatible with most educational contexts. To illustrate this point, the National Science Foundation's 2015–2016 Survey of Earned Doctorates reported there are 40 subdisciplines within the field of education.[3] A review of dissertations completed across these many education subdisciplines found that less than 1% involved randomized experiments.[4]
Interestingly, the disconnect between medical/health science researchers and education researchers, whose training typically is rooted in the social and behavioral sciences, appears to come down to a single but significant difference in perspective. Medical and health science researchers typically begin their research with an experimental design in mind and then craft a research question that can be answered within an experimental context. Education and other social and behavioral researchers, on the other hand, begin with a particular research question in mind and then identify an appropriate research design to best answer the question(s). These fundamental differences in research norms based on one's disciplinary training tend to create confusion about what constitutes “good” research in interdisciplinary fields such as medical education, where the biomedical and social/behavioral sciences converge.
Any ongoing confusion about what constitutes “good” medical education research is harmful for the medical education community and needs resolution. For example, many medical education researchers with formal training in education research have expressed frustration with reviewers and editors who insinuate that any study that does not utilize some form of experimental design is “weak,” “flawed,” or otherwise incapable of yielding valid results.[2] On the other hand, many researchers with little or no formal training in education research are frustrated by article submissions that do not meet their expectations for a high-quality research design.[2] Thus, some resolution, or dialog toward resolution, is necessary.
To that end, the purpose of this study was to investigate the research designs utilized in recent dissertations and theses in the subject area of medical education. The rationale is that dissertations and theses provide unique insights about research in the field of medical education. For example, dissertations and theses may originate from a variety of academic disciplines, tend to be rather exhaustive works often exceeding a hundred pages, typically involve rigorous research designs, usually are led by a committee chair/mentor who is a content expert in the subject area, and almost always are evaluated by a team of 3–5 (or more) faculty experts, many of whom also are subject matter experts and specialists in research methodology.
Methods
A bibliometric analysis was conducted using ProQuest Dissertations and Theses, the largest repository of dissertations and theses in the world, containing more than 3.8 million works. A search was conducted using the keyword “medical education” in the subject field, limited to the most recent 5 years (August 1, 2011 to August 1, 2016). This search yielded a total of 147 dissertations and theses. Each dissertation and thesis was then reviewed to determine whether an experimental or quasi-experimental design was employed.
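The screening and tallying step described above can be scripted. The sketch below is illustrative only; it assumes, hypothetically, that the retrieved records were exported to a file named records.csv containing a reviewer-coded column named design (e.g., “randomized,” “quasi-experimental,” or “none”). Neither the file nor the column names are part of the original study.

# Minimal sketch (Python), assuming a hypothetical CSV export of the coded records.
import csv
from collections import Counter

with open("records.csv", newline="", encoding="utf-8") as f:
    records = list(csv.DictReader(f))

# Tally how many works fall under each reviewer-assigned design code.
design_counts = Counter(record["design"] for record in records)
total = len(records)

for design, count in design_counts.items():
    print(f"{design}: {count} of {total} ({count / total:.2%})")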
Results
Of the 147 dissertations and theses identified, 14 (9.52%) utilized either an experimental or quasi-experimental design. Of these 14 studies, 11 (78.57%) were dissertations and 3 (21.43%) were theses. With respect to experimental design type, 9 (64.29%) involved randomized experiments, and 5 (35.71%) were quasi-experimental in nature. Of the 9 randomized experimental studies, 7 (77.78%) were dissertations, and 2 (22.22%) were theses.
With respect to degree type, of the 14 experimental or quasi-experimental designs used, 6 (42.86%) were Doctorates in Education (EdD), 4 (28.57%) were Doctorates of Philosophy (PhD), 1 (7.14%) was a Doctorate of Public Health, and 3 (21.43%) were Masters of Science degrees. Finally, with respect to the colleges granting these degrees, 7 (50.00%) were from colleges of education, whereas 7 (50.00%) came from other colleges (e.g., medicine, public health, engineering, and arts and sciences). A full breakdown of results is presented in [Table 1].

Table 1: Medical education dissertations and theses utilizing experimental or quasi-experimental designs
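As a quick arithmetic check, the headline figures reported above and in the abstract follow directly from the raw counts (14 of 147 works used an experimental or quasi-experimental design, 9 of which were randomized). The short Python snippet below simply recomputes those percentages; it is illustrative only and introduces no data beyond the counts already reported.

# Recompute the headline percentages from the counts reported in the Results.
total_works = 147            # dissertations and theses retrieved
experimental_or_quasi = 14   # works using an experimental or quasi-experimental design
randomized = 9               # subset involving randomized experiments

print(f"Experimental or quasi-experimental: {experimental_or_quasi / total_works:.2%}")  # 9.52%
print(f"Randomized experiments: {randomized / total_works:.2%}")                         # 6.12%
print(f"Randomized share of the 14: {randomized / experimental_or_quasi:.2%}")           # 64.29%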
Discussion
Overall, 9.52% of doctoral dissertations and master's theses in the subject area of medical education over the 5-year period involved some type of experimental design. Only 6.12% of these projects involved randomized experiments, a rate only slightly higher than that observed for dissertations and theses in the field of education generally.[4] These results support the notion that research in medical education has far more in common with the methodological conventions of education and the social and behavioral sciences than with those of the medical and health sciences.
Of course, some may argue that dissertations and theses are subpar in quality when compared to peer-reviewed, published literature. Although there certainly will be some variability in the quality of graduate projects (much as there is variability in the quality of peer-reviewed papers), there also is much reason to believe these graduate projects may actually be of superior quality to many peer-reviewed papers published in the medical education literature (e.g., rigorous requirements, expert faculty oversight and mentoring, and subject matter expert evaluators). Thus, it would be a mistake to dismiss the quality of dissertations and theses simply because they have yet to undergo peer review by an academic journal, especially given long-standing concerns about academic peer review (e.g., reviewer shortages and reviews of questionable quality).[19],[20] In fact, many fine peer-reviewed publications stem directly from a researcher's graduate project.
Conclusions
Medical education research greatly benefits from interdisciplinary perspectives, including perspectives on methodological practice and research design. Certainly, researchers are encouraged to pursue experimental designs when ethical, legal, and operational constraints permit. However, the results of this study further illustrate that conducting randomized experiments in education is an ambitious but often unattainable goal. These findings should serve as a reminder to reassess assumptions about research designs when conducting research in the field of medical education.
Financial support and sponsorship
Nil.
Conflicts of interest
Both authors are editors for Education in the Health Professions. Thus, peer review was initiated and performed by an independent editor associated with the journal.
References
1. Zlowodzki M, Jonsson A, Bhandari M. Common pitfalls in the conduct of clinical research. Med Princ Pract 2006;15:1-8.
2. Royal KD, Rinaldo J. There's education, and then there's education in medicine. J Adv Med Educ Prof 2016;4:150-4.
3. National Science Foundation, National Center for Science and Engineering Statistics. Survey of Earned Doctorates, 2015-2016.
4. Nave B, Miech EJ, Mosteller F. The role of field trials in evaluating school practices: A rare design. In: Stufflebeam DL, Madaus GF, Kellaghan T, editors. Evaluation Models: Viewpoints on Educational and Human Services Evaluation. Boston: Kluwer Academic Publishers; 2000. p. 145-61.
5. Emmert MC. Pilot Test of an Innovative Interprofessional Education Assessment Strategy [dissertation]. USA: University of California – Los Angeles; 2011.
6. Gucev GV. Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia [dissertation]. USA: University of Southern California; 2012.
7. Miller Juve AK. Reflective Practice and Readiness for Self-directed Learning in Anesthesiology Residents Training in the United States [dissertation]. USA: Portland State University; 2012.
8. Best AC. Online Academic Support Peer Groups for Medical Undergraduates [dissertation]. USA: Nova Southeastern University; 2012.
9. Hess JS. Residency Education in Preparing Adolescents and Young Adults for Transition to Adult Care: A Mixed Methods Pilot Study [dissertation]. USA: University of South Florida; 2014.
10. McCoy L. Virtual Patient Simulations for Medical Education: Increasing Clinical Reasoning Skills through Deliberate Practice [dissertation]. USA: Arizona State University; 2014.
11. Buck KS. Impact of Educational Messages on Patient Acceptance of Male Medical Students in OB-GYN Encounters [dissertation]. USA: East Carolina University; 2014.
12. Cheung JJ. Preparing for Simulation-Based Education and Training Through Web-Based Learning: The Role of Observational Practice and Educational Networking [thesis]. Canada: University of Toronto; 2014.
13. Wee AG. Promoting Oral Cancer Examination to Primary Care Providers at Medical Clinics in Nebraska [dissertation]. USA: University of Nebraska Medical Center; 2014.
14. Latham PS. The Impact of Instructional Design in a Case-Based, Computer-Assisted Instruction Module on Learning Liver Pathology in a Medical School Pathology Course [dissertation]. USA: George Washington University; 2015.
15. Carney BL. Improving Safe Opioid Prescribing among Internal Medicine Residents using an Observed Structured Clinical Exam (OSCE) Education Tool [thesis]. USA: Boston University; 2015.
16. Ford CL. An Evaluation Study of the Biomedical Careers Program at Robert Wood Johnson Medical School [dissertation]. USA: Rutgers University; 2015.
17. Varthis S. Students' Perceptions of Blended Learning and its Effectiveness as a Part of Second Year Dental Curriculum [dissertation]. USA: Columbia University; 2016.
18. Hill JA. Examining the Time Course of Memory Retention for Medical Gross Anatomy in First Year Medical Students [thesis]. USA: Boston University; 2016.
19. Albert AY, Gow JL, Cobra A, Vines TH. Is it becoming harder to secure reviewers for peer review? A test with data from five ecology journals. Res Integr Peer Rev 2016;1:14.
20. Hames I. Peer Review and Manuscript Management in Scientific Journals: Guidelines for Good Practice. Chichester, United Kingdom: Wiley; 2007.