Teaching Medical Students Diagnostic Sonography

Peter H. Arger, MD, Susan M. Schultz, RDMS, Chandra M. Sehgal, PhD, Theodore W. Cary, Judith Aronchick, MD

Objective. The purpose of this pilot project was to train medical students in sonography. Methods. Thirty-three medical students participated in a pilot sonography course, which included exposure to ultrasound physics, knobology of a compact ultrasound scanner, training in scanning and anatomy of the aorta and right kidney, and reading assignments in these areas. Pretraining and posttraining examinations were given in these areas to assess the degree of knowledge gained by these methods. Results. Nearly all of the medical students increased their basic knowledge of sonography and improved their scanning skills, and the improvement was statistically significant in all areas. Conclusions. Training in sonography for medical students could be used as a foundation for later, more specialty-specific training to improve the overall sonography skills of all physicians. Key words: compact scanner; medical students; pilot training project; sonography.

Received June 2, 2005, from the Department of Radiology, University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania 19104 USA. Revision requested June 25, 2005. Revised manuscript accepted for publication July 6, 2005. The equipment used in this study was funded by a grant from the Clinical Practices of the University of Pennsylvania Academic Committee, 2004. Address correspondence to Peter H. Arger, MD, Department of Radiology, University of Pennsylvania School of Medicine, 3400 Spruce St, Philadelphia, PA 19104 USA. E-mail: peter.arger@uphs.upenn.edu

The use of ultrasound has greatly expanded in clinical medicine over the past 2 decades. More and more physicians worldwide are using this valuable imaging tool in various medical subspecialties. Appropriate education is necessary to evaluate patients and provide the best information pertinent to the patients' medical conditions. Most of the education and training in diagnostic sonography in the various subspecialty residency and fellowship programs is based on criteria from the subspecialty Residency Review Committees of the Accreditation Council for Graduate Medical Education. There is no standardization of training in sonography across the various subspecialties, and the quality of the ultrasonic medicine they practice varies accordingly. More and more physicians in practice with minimal training in sonography are using it, again with varying degrees of quality because of the wide variety of training offered after subspecialty training (eg, didactic courses of a few days, weekend courses, and equipment manufacturer training of buyers). Furthermore, with the advent of compact ultrasound machines, it seems likely that this expansion of ultrasound use will include house staff and medical students.
This is already beginning to occur at many institutions, where house officers with little or no training or experience in sonography are using compact ultrasound machines to help them meet their varied clinical needs. Therefore, to ensure that physicians are trained to practice high-quality ultrasonic medicine for their patients' best benefit, it seems prudent and sensible to begin training in diagnostic sonography in medical school. This will give medical students a background for future, more comprehensive education in their chosen subspecialty, which will result in their being able to practice high-quality ultrasonic medicine.

This article presents the results of a pilot project on training medical students in sonography using a compact ultrasound scanner. It details a potential medical student training method within the context of a well-organized and highly popular medical school program at the University of Pennsylvania: Radiology 300, a long-established radiology course that exposes medical students to all aspects of the subspecialty.

Materials and Methods

This study was approved by the Institutional Review Board of the University of Pennsylvania, and all students signed permission forms to be part of the study. A subgroup of 33 third- and fourth-year medical students taking the Radiology 300 course at the University of Pennsylvania Medical School volunteered to receive training in sonography. The training was divided into 4 weeks, with each student attending 2 hours per week for a total of 8 hours per student. The four 2-hour sessions were administered as follows.

Week 1

Each student was given a copy of the textbook Ultrasound Secrets1 and a CD with 15 images (8 of the aorta and 7 of the right kidney). These images were acquired by scanning a volunteer model on a SonoSite TITAN system (SonoSite, Inc, Bothell, WA) to illustrate for the students the images they were to obtain for a complete examination of these 2 organs. A 30-question multiple-choice test was then given to assess the students' baseline knowledge before any reading, review of the CD, or teaching. The test included 5 questions on ultrasound physics, 5 questions on sonographic technique, and 20 questions on clinical diagnostic sonography of the aorta and kidney. An experienced and skilled sonographer (S.M.S.) then demonstrated on the volunteer model how to scan and what should be seen in the assigned anatomic evaluation of the aorta and kidney, and a knobology demonstration of the SonoSite TITAN compact ultrasound machine was given. The students were asked to read a total of 47 assigned pages from chapters 1 (Physics), 2 (Ultrasound Artifacts), 17 (Kidney), and 40 (Abdominal Aorta) of Ultrasound Secrets. All 30 questions on the written test were taken from these 4 chapters.

Week 2

Each student was given a list of the desired images after the images obtained on the volunteer model, as described for week 1, were reviewed again. The students then scanned each other under the supervision of a skilled sonologist (P.H.A.), with each student given 30 minutes to complete the examination. The sonologist provided no training during this session so that pretraining scanning skills could be assessed. The images acquired by the students were saved for comparison with the images to be acquired during week 4.

Week 3

Under the supervision of the same skilled sonologist (P.H.A.), each student scanned other students for 30 minutes.
No images were saved at this time. This was a training session in which the skilled sonologist worked with the students to increase their scanning skills and their knowledge of sonographic anatomy. In addition to their individual scanning time (30 minutes), each student was able to watch and listen to the training sessions of their peers for an additional 90 minutes. No effort was made to pair the students during this project.

Week 4

Each student took the same written test given in week 1 as a posttraining test, and the posttraining scores were compared with the pretraining scores to evaluate improvement in knowledge. All of the students were given a copy of their pretraining written examination, their posttraining written examination, and a copy of the examination with the correct answers indicated.
A list of the desired images was then given to each student, along with another review of the images obtained on the volunteer model. Each student was then given 20 minutes to repeat on a fellow student the examination done in week 2. No effort was made to repeat the examination on the same student scanned during week 3. These images were saved for comparison with the untrained images obtained during week 2.

The images from the pretraining scanning examination of week 2 and the posttraining scanning examination of week 4 were stored and then compared for improvement by a skilled sonologist (P.H.A.). The students' names on the images were masked from the sonologist grading the examinations. Both the pretraining and posttraining scanning examinations were graded on a scale of 0 to 3 as follows: 0, wrong area, anatomy not shown; 1, fair scan quality, less than 50% of the anatomy shown; 2, good scan quality, 50% or more of the anatomy shown; and 3, excellent scan quality, all of the anatomy shown. Each image was graded; the grades were summed and divided by the number of images to give an average overall score for the examination. The pretraining and posttraining scanning scores were then compared for each student. A paired Student t test was performed on both the written examination results and the practical scanning scores to determine the extent and significance of improvement and whether the training was successful.

Results

Of the 33 students involved in the project, 22 were male and 11 were female. The range of scores on the initial multiple-choice test was 37% to 63% (average score ± SD, 50.3% ± 7.5%; median, 53%) (Figure 1). The range of scores on imaging the aorta before scanning training was 0.5 to 2.1 (average, 1.15 ± 0.38; median, 1.1) (Figure 2). The range of scores on imaging the kidney before scanning training was 0.4 to 2.4 (average, 1.22 ± 0.37; median, 1.1) (Figure 3). The range of scores on the posttraining multiple-choice test was 33% to 100% (average, 76.4% ± 15%; median, 77%) (Figure 1). The range of scores on the posttraining aorta scanning examination was 1.3 to 3.0 (average, 1.94 ± 0.94; median, 1.9) (Figure 2). The range of scores on the posttraining kidney scanning examination was 1 to 2.7 (average, 1.95 ± 0.37; median, 2.0) (Figure 3). Two (6%) of the 33 students performed less well on the posttraining multiple-choice test than on the initial untrained examination. One (3%) of the 33 students had reduced performance on the posttraining kidney scanning examination compared with the initial untrained examination (from 2.4 to 2.3), and 1 (3%) scored the same on both the pretraining and posttraining kidney examinations (1.7). The improvement was statistically significant for each analysis; for the written, aorta, and kidney tests, P < .001.

Figure 1. Comparison of pretraining and posttraining written test scores.

Figure 2. Comparison of pretraining and posttraining aorta scan test scores.

Figure 3. Comparison of pretraining and posttraining kidney scan test scores.
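To make the scoring and analysis concrete, the procedure described in Materials and Methods amounts to averaging the 0-to-3 grades assigned to each image into an overall examination score and then comparing each student's pretraining and posttraining scores with a paired Student t test. The following Python sketch illustrates that workflow; the per-image grades and the small score lists are hypothetical values chosen for illustration, not data from this study.

from statistics import mean
from scipy import stats

def exam_score(image_grades):
    # Average the 0-3 grades assigned to the individual images of one
    # scanning examination to obtain the overall examination score.
    return mean(image_grades)

# Hypothetical grades for one student's 8 aorta images, before and after training.
pre_exam = exam_score([1, 1, 2, 0, 1, 1, 2, 1])
post_exam = exam_score([2, 3, 2, 2, 3, 2, 2, 3])

# Hypothetical paired examination scores for a small group of students.
pre_scores = [1.1, 0.9, 1.4, 1.2, 1.0]
post_scores = [1.9, 1.7, 2.2, 2.0, 1.8]

# Paired (dependent-samples) Student t test comparing posttraining with
# pretraining scores, as used for the written and scanning comparisons.
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"one student: {pre_exam:.2f} -> {post_exam:.2f}; group: t = {t_stat:.2f}, P = {p_value:.4f}")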
Discussion

The criteria for sonographic training in the various medical subspecialty groups vary with the Residency Review Committee for each subspecialty. Groups with varying criteria for adequate training of physicians using diagnostic ultrasound include the American College of Radiology,2 the American Institute of Ultrasound in Medicine,3 the American Urological Association, the American College of Emergency Physicians,4,5 and the American College of Cardiology.6 Because no agreement exists among these organizations as to what constitutes adequate training for a given subspecialty, it would be well for all specialty groups to agree that a basic grounding in diagnostic sonography at the medical school level is necessary, so that all physicians have a background that allows adequate further preparation within a given subspecialty area.7-11 Therefore, it is important for educators of medical students to be at the forefront of investigating what particular medical school training could provide this basic foundation.12,13 This pilot project examined one possible method; this method and others could be combined to achieve the goal of adequate basic sonography training for medical students.

The method used in this investigation gives a basic introduction to ultrasound physics, technology, machine knobology, scanning experience and technique, and sonographic anatomy, provided by a skilled sonographer and sonologist. Expanding the scope of the scanning experience and technique, as well as the organ anatomy covered, could provide a basic grasp of the knowledge needed for a grounding in diagnostic sonography. Ultrasound scanners are relatively inexpensive and portable; as a result, this education can be offered to a large number of students at a reasonable cost, unlike computed tomography or magnetic resonance imaging.

Clearly, much more exposure to diagnostic sonography is needed by medical students so that they are able to provide their patients with high-quality examinations. However, it is also clear from our results that the students involved in this project completed it with a better understanding of the anatomy of the aorta and right kidney, a greater appreciation of the skills necessary to provide an excellent ultrasound examination, and a greater appreciation of the need for proper training to enable them to use this imaging modality effectively if necessary in the future. With few exceptions, the medical students both increased their basic knowledge of ultrasound and improved their scanning skills as users of diagnostic ultrasound, as shown by the significance of the data analyzed by the Student t test. The 2 students who performed more poorly on the second examinations presumably did so because of a lack of interest or failure to read the assigned material.

In conclusion, a basic education in diagnostic sonography for medical students provides a foundation that individual subspecialty groups using ultrasound can build on to advance training according to the needs of each specialty. This would strengthen the depth and scope of sonography training in general and would better ensure that patients receive appropriately performed examinations when needed.

References

1. Dogra V, Rubens DJ (eds). Ultrasound Secrets. Philadelphia, PA: Hanley and Belfus; 2004.
2. American College of Radiology. ACR standard for performing and interpreting diagnostic ultrasound examinations. In: Standards. Reston, VA: American College of Radiology; 1996:235-236.
3. American Institute of Ultrasound in Medicine. Training Guidelines for Physicians Who Evaluate and Interpret Diagnostic Ultrasound Examinations. Laurel, MD: American Institute of Ultrasound in Medicine; 1997.
4. Mateer J, Plummer D, Heller M, et al. Model curriculum for physician training in emergency ultrasonography. Ann Emerg Med 1994; 23:95-102.
5. Lanoix R, Baker WE, Mele JM, Dharmarajan L. Evaluation of an instructional model for emergency ultrasonography. Acad Emerg Med 1998; 5:58-63.
6. Conti CR. The ultrasonic stethoscope: the new instrument in cardiology? Clin Cardiol 2002; 25:547.
7. Bruce CJ, Montgomery SC, Bailey KR, Tajik J, Seward JB. Utility of hand-carried ultrasound devices used by cardiologists with and without significant echocardiographic experience in the cardiology inpatient and outpatient settings. Am J Cardiol 2002; 90:1273-1275.
8. Wittich CM, Montgomery SC, Neben MA, et al. Teaching cardiovascular anatomy to medical students by using a handheld ultrasound device. JAMA 2002; 288:1062-1063.
9. Rozycki GS, Shackford SR. Ultrasound: what every trauma surgeon should know. J Trauma 1996; 40:1-4.
10. Rodney WM, Prislin MD, Orientale E, McConnell M, Hahn RG. Family practice obstetric ultrasound in an urban community health center: birth outcomes and examination accuracy of the initial 227 cases. J Fam Pract 1990; 30:163-168.
11. Hahn RG, Roi LD, Ornstein SM, et al. Obstetric ultrasound training for family physicians: results from a multi-site study. J Fam Pract 1988; 26:553-558.
12. Greenbaum LD. It is time for the sonoscope. J Ultrasound Med 2003; 22:321-322.
13. Filly RA. Is it time for the sonoscope? If so, then let's do it right! J Ultrasound Med 2003; 22:323-325.