Student Ratings of Selection Factors for PsyD Programs



Mitchell D. Dornfeld, Sharon Green-Hennessy, Jeffrey Lating, and Matthew Kirkhart
Loyola University Maryland

Objectives: To explore which factors doctor of psychology (PsyD) students feel are important to consider when selecting a PsyD program. Design: This article analyzes the survey responses of 394 enrolled PsyD students and 17 directors of clinical training (DCTs), who rated the importance of 18 factors in program selection, to understand what qualities PsyD students and DCTs value in a PsyD program. Students were also asked to assess how their program fared on the same 18 dimensions. Results: Participants rated the program's structure, tone, and reputed quality of training as the most important factors in program selection (Ms of 4.13 to 4.54 on a 5-point scale). Additionally, students rated their current program as high in quality on the same factors that they felt were most important in program selection (rs ranging from .15 to .37). Conclusions: PsyD students rated a program's structure, tone, and reputation as particularly important factors to consider in selecting a program. Students' quality ratings were used to determine the top five programs for each of the factors assessed in the study. © 2012 Wiley Periodicals, Inc. J. Clin. Psychol. 68:279-291, 2012.

Keywords: clinical psychology graduate training; program rankings; student attitudes; PsyD; doctor of psychology; reasons for selection of training program

Applicants to doctor of psychology (PsyD) programs in clinical psychology face increasing training options (Norcross, Kohout, & Wicherski, 2005), varying on such dimensions as average time to degree completion, acceptance rate, type of degree, theoretical orientation, financial assistance, training model, and university affiliation (Norcross, Ellis, & Sayette, 2010). Guidebooks, rankings, advisors, and websites provide considerable data.
However, this information might be lacking, inaccurate, or difficult for applicants to integrate (Burgess, Keeley, & Blashfield, 2008; Hunter, Delgado-Romero, & Stewart, 2009; Landrum, 2010; Norcross et al., 2010; Reynolds, Sargeant, Rooney, Tashiro, & Lejuez, 2008). Integrating all of the information is difficult for both PhD and PsyD applicants (Reynolds et al., 2008); however, it is particularly challenging for PsyD applicants, likely because of PsyD programs' relative youth and recent growth (Norcross, Castle, Sayette, & Mayne, 2004). Although PsyD programs have eclipsed PhD programs in number of graduates, fewer data are available on PsyD programs than on PhD programs (Norcross et al., 2004; Norcross et al., 2005; Norcross et al., 2010). Moreover, the available information fails to reflect the heterogeneity among PsyD programs, often collapsing across types of PsyD programs without considering institutional setting (Baker, McFall, & Shoham, 2009; Graham & Kim, 2011; Norcross et al., 2004; Norcross et al., 2010). Without relevant, accurate information, prospective PsyD students are likely uncertain where to apply; similarly, clinical programs might suffer without compatible students able to make the best use of their training resources, as students might not fit with programs' strengths, weaknesses, and training missions (Fauber, 2006).

Sources of information on PsyD programs vary in the extent to which they summarize or integrate program data or leave this task to the applicant. Guidebooks, the informational source most commonly used by prospective students (McIlvried, Wall, Kohout, Keys, & Goreczny, 2010), exemplify the summative approach. However, they typically provide the same information for PhD and PsyD programs, making some information irrelevant to prospective PsyD students, who often are choosing a professional degree because of its differences from a PhD program (McIlvried et al., 2010; Norcross & Castle, 2002). Moreover, these printed compendiums do not assess subjective aspects of program quality and fail to capitalize on students' current data-gathering methods (Hunter et al., 2009).

Nonconsolidated sources (i.e., websites, message forums, blogs) provide considerable data from current students' perspectives and are increasingly used by applicants (Fauber, 2006). Information from current students might be especially valued because current students are seen as similar to the applicant and are in the position the applicant desires to attain (Suls, Martin, & Wheeler, 2000). However, these sources can potentially cause "cognitive overload" (Galotti, 2001, p. 284) because they often require the applicant to infer whether the information is sufficient, important, and reliable. Additionally, although the American Psychological Association (APA) requires that accredited programs publish specific data on their websites, recent studies have questioned the extent to which they provide the information students want (Hunter et al., 2009) or are accurate in what they present (Burgess et al., 2008). For example, several program websites have listed internship match rates higher than those reported by the Association of Psychology Postdoctoral and Internship Centers (APPIC; Burgess et al., 2008), and many fail to include factors important in the applicant's decision-making process, such as average Graduate Record Examinations (GRE) scores and cost of tuition (Hunter et al., 2009).

[Author note: This article is based on a doctoral dissertation by Mitchell Dornfeld. Mitchell D. Dornfeld is now at Morrison Child and Family Services, Gresham, Oregon. Correspondence concerning this article should be addressed to Mitchell Dornfeld, Morrison Child and Family Services, 2951 NW Division, Suite 200, Gresham, OR 97030; e-mail: Mitchell.Dornfeld@gmail.com. Journal of Clinical Psychology, Vol. 68(3), 279-291 (2012). Published online in Wiley Online Library (wileyonlinelibrary.com/journal/jclp). © 2012 Wiley Periodicals, Inc. DOI: 10.1002/jclp.20864]

Program rankings integrate program information and include an overt evaluative component that is absent from guidebooks.
Although the extent to which they play a role in PsyD program selection is unclear, undergraduate rankings have been shown to influence college selection (Griffith & Rask, 2007; Meredith, 2004; Monks & Ehrenberg, 1999), which is the admissions model with which most graduate applicants start. One of the best-known educational ranking systems, U.S. News & World Report's, assesses psychology graduate programs every 5 years and bases its rankings exclusively on the results of peer assessment surveys sent to academics in each discipline (U.S. News & World Report, 2008, para. 1). This reliance upon reputation as perceived by academic psychologists leaves the U.S. News & World Report system vulnerable to bias in its ranking of PsyD programs, given that the majority of doctoral faculty still reside within traditional PhD departments (Kohout & Wicherski, 2010). Also problematic is that U.S. News & World Report provides a single global evaluation, or ranking, of programs. Graduate programs, like students, have their own profiles of strengths and weaknesses.

However, few studies have examined the specific factors that students feel are important to consider in PsyD program selection. Walfish, Stenmark, Shealy, and Shealy's (1989) study of first-year doctoral students in APA-accredited clinical programs reported goodness of fit, amount of clinical supervision, and emotional atmosphere as the most important items in choosing a graduate program. However, Walfish et al.'s sample comprised both PhD and PsyD students, groups that McIlvried et al. (2010) suggested would differ on what they feel is important. A 1998 APA survey of 449 professional psychology students identified goodness of fit, APA accreditation, curriculum, location, and reputation as important (McIlvried et al., 2010)
and a survey of graduate students in industrial-organizational (I-O) psychology identified class size, faculty involvement and support of students, program culture, quality of instruction, and faculty research interests as the most important aspects to consider (Kraiger & Abalos, 2004). Although these studies are dated and assessed different samples, their results suggest that goodness of fit, program culture/supportiveness, and instruction/supervision are important in program selection.

The purpose of this study was to survey directors of clinical training (DCTs) and current PsyD students in APA-accredited programs about which aspects are important to consider in PsyD program selection and then to obtain current students' perceptions of the quality of their programs on those dimensions.

Method

Participants

This study comprised two groups: currently enrolled PsyD students in clinical psychology and DCTs. Participation was solicited from the DCTs and students in each of the 58 APA-accredited PsyD programs in clinical psychology as of September 2006.

PsyD students. A total of 394 PsyD students representing 21 different programs provided surveys with fewer than two responses missing in each section. More than half (56.1%) of these students identified themselves as between 25 and 34 years of age. Individuals between 18 and 24 years of age comprised the next largest category (29.2%), with only 13.7% of respondents being older than 35 years of age. The majority of the students identified themselves as Caucasian (88.8%), female (77.9%), enrolled as full-time students (94.9%), and attending a university-affiliated program (87.5%). Approximately one fifth (19.8%) were first-year students, 25.9% second-year, 22.3% third-year, 16.5% fourth-year, and 10.9% fifth-year students. Only 3.0% were in their sixth year or beyond. The most commonly anticipated future employment settings were independent practice (27.4%) and hospitals (23.6%).

DCTs. Of the 58 DCTs surveyed, 17 (29.3%) completed the survey. Demographically, DCT respondents were dispersed relatively evenly across age groupings (35-44, 45-54, and 55+ years) and gender. In terms of their own educational background, 11 of the DCTs had a PhD degree, five had a PsyD, and one had a PsyD/PhD. With regard to their current institutional setting, all DCTs reported being from university-affiliated programs, with six housed in departments of psychology and 11 in university-based professional schools. No DCTs from freestanding schools responded to the survey.

Measures

Survey of PsyD Programs-Graduate Students (SPP-GS). The SPP-GS is a 45-item, self-report survey comprising three subsections. The first section (18 items) assesses the importance students place on a variety of factors when choosing a clinical PsyD program. Items were partially modeled after Kraiger and Abalos's (2004) list of factors that I-O psychology graduate students considered in ranking their graduate programs.
Specifically, items from Kraiger and Abalos's measure that assessed features shared by clinical and I-O psychology programs (quality of incoming students, faculty-student interactions, stability of the faculty, quality of instruction, program culture, applied-academic balance, program demands, availability of educational resources, connection with the surrounding professional community, completion rate, career placement services, location, and funding) were included. The phrasing of some items was modified. Four additional items, related to externships, reputation, quality of program graduates, and usefulness of training, were also added. Each item contained the name of the item with accompanying definitions and examples (e.g., demands of the program: number of credits required for degree completion, amount of clinical hours required prior to internship, and presence of a research project). A variant of each of the 18 items had appeared in one or more of the prior studies on items considered in program selection (Kraiger & Abalos, 2004; McIlvried et al., 2010; Walfish et al., 1989), with the exception of the aforementioned demands-of-the-program item. For each item, students were asked to rate, on a 5-point Likert-type scale ranging from 1 (not important) to 5 (extremely important), how important it was in their decision to attend their graduate program. Additionally, students were asked to identify whether there were additional considerations in their program selection and, if so, to rate those items' importance. Internal consistency for the 18 student importance ratings was measured via Cronbach's alpha (α = .86).

Participants were then informed that the second section of the measure was designed to collect perceptions of the quality of graduate education from the perspective of students.
Participants were presented with the same 18 items as in the first section, but here they were asked to rate the quality of each of these items at their program on a 5-point Likert-type scale ranging from 1 (poor) to 5 (excellent). Internal consistency for student quality ratings was measured via Cronbach's alpha (α = .88). The third section assessed participant demographic and program information (program name, year, full-time/part-time status, preferred future employment setting).
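The internal-consistency values reported above (α = .86 and α = .88) are Cronbach's alpha. As a rough illustration of the statistic (this is not the authors' code, and the ratings matrix below is invented), alpha can be computed from a respondents-by-items matrix as follows:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) matrix of Likert ratings.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Invented 5-point ratings from four respondents on three items
ratings = [
    [4, 5, 4],
    [3, 4, 3],
    [5, 5, 5],
    [2, 3, 2],
]
print(round(cronbach_alpha(ratings), 2))  # → 0.98
```

Alpha rises as the items covary more strongly relative to their individual variances, which is why it is read as a measure of internal consistency.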

Survey of PsyD Programs-Directors of Clinical Training (SPP-DCT). The SPP-DCT is a 23-item, self-report survey for DCTs that comprises two sections. The first section is identical in format to the first section of the SPP-GS, except that instead of being asked about their own graduate school decisions, respondents were asked to rate how important each item should be in a student's decision to attend a PsyD program. Internal consistency for the DCT importance ratings was measured via Cronbach's alpha (α = .88). The second section queried DCT demographic and institutional setting information. Unlike the SPP-GS, the SPP-DCT did not ask for quality ratings.

Procedure

Clinical PsyD programs accredited by the APA as of September 2006 were selected for inclusion. This represents 93.4% of the currently APA-accredited PsyD programs (APA, 2011). DCTs were asked via e-mail to complete the SPP-DCT and to forward to their students a link to the SPP-GS. One month after initial contact, a reminder e-mail was sent to schools with no responses. The survey was available for completion from September to December 2006. All participants were informed that the students' responses would be linked to their particular graduate program. The consent forms clarified that if less than 10% of the students from a particular program responded, then data regarding students' perceptions of program quality would not be reported.

Results

Representativeness of the Sample

Responses were obtained from students representing 21 of the 58 (36.2%) solicited PsyD programs. With the exception of Puerto Rico, there were approximately equal numbers of programs from each U.S. Census Bureau geographical region. Over half (52.4%) of the participating programs comprised fewer than 100 students, with only 14.3% of the programs having over 200 students.
In terms of affiliation, 11 (52.4%) of the programs were housed in departments of psychology and seven (33.3%) were in university-based professional schools; these numbers represented over one half of the programs in each of those categories. Twenty-two freestanding programs were contacted for participation in this study, but students from only three of those programs responded, none of which met the 10% threshold. Thus, the results of this study best represent the views of academically housed PsyD programs.

Importance Ratings

Both students and DCTs reported that 17 of the 18 factors were at least moderately important (rating of 3 or above). Students' importance ratings fell between 2.93 and 4.53, and DCTs' between 2.59 and 4.71. Students rated several factors as very important (rating of 4 or above; see Table 1). These higher ranked factors related to the program's structure (program's usefulness, balance between applied and academic emphasis, quality of instruction, externship opportunities), the program's atmosphere or tone (culture of program, quality of faculty-student interaction), and its reputation (reputation, quality of program graduates). Among these, the highest rated individual factor was program usefulness (mean [M] = 4.54, standard deviation [SD] = 0.68). To these factors, DCTs also added as very important (mean rating of at least 4) being connected to the larger professional field (connection with psychological community) and factors that appeared to be related to successful program completion (completion rate, demands of program, and available resources). The highest rated factor among the DCTs was quality of instruction (M = 4.71, SD = 0.47).

In terms of less valued dimensions, both students and DCTs ranked career enhancement activities (ranked 15th) in the bottom third. This category was defined primarily in terms of research opportunities with faculty and opportunities to attend conferences and workshops.
Also less valued by both groups of respondents was faculty stability or turnover (ranked 16th).

Table 1
Mean Ratings and Ranks of the Importance of Individual Factors in One's Decision to Attend a PsyD Program

                                                Graduate students (n = 394)   DCTs (n = 17)
Factor                                          M     SD    Rank              M     SD    Rank      p
Program's usefulness                            4.54  0.68  1                 4.41  0.71  5/6       .437
Quality of instruction (a)                      4.41  0.84  2                 4.71  0.47  1         .023
Quality of faculty-student interaction (b)      4.35  0.88  3                 4.59  0.51  3         .276
Balance between applied and academic emphasis   4.33  0.90  4                 4.65  0.49  2         .022
Culture of program                              4.28  0.87  5                 4.29  0.85  7         .964
Quality of program graduates                    4.27  1.00  6                 4.47  0.62  4         .421
Externship opportunities                        4.20  1.02  7                 4.41  0.51  5/6       .127
Reputation                                      4.13  0.90  8                 4.00  1.06  12/13     .571
Completion rate                                 3.87  1.10  9                 4.06  0.75  9/10/11   .321
Demands of program (a)                          3.66  1.02  10                4.18  0.73  8         .039
Connection with psychology community            3.65  1.03  11                4.06  0.56  9/10/11   .010
Available resources                             3.57  1.04  12/13             4.06  0.83  9/10/11   .054
Location                                        3.57  1.26  12/13             3.12  0.78  17        .035
Quality of new students (a)                     3.56  1.03  14                3.94  0.66  14        .036
Career enhancement activities                   3.32  1.13  15                3.76  0.83  15        .046
Stability of faculty                            3.26  1.14  16                3.71  0.85  16        .112
Available funding (a)                           3.15  1.25  17                4.00  0.82  12/13     .001
Career assistance                               2.93  1.09  18                2.59  0.80  18        .199

Note: DCT = director of clinical training; M = mean; SD = standard deviation. (a) Student n = 393. (b) Student n = 392.

Career assistance, involving formal job placement services and informal networking, was rated the least helpful of the dimensions by both students and DCTs. Students ranked available funding 17th of the 18 factors. It is important to note that only career assistance received a mean rating below 3 (moderately important). A total of 13.8% of students included an additional "other" factor as being important in program selection, with the most common being religious emphasis (n = 17), theoretical orientation (n = 11), and diversity of students/faculty (n = 9).
Similarities Between Students' and DCTs' Perceptions

Separate t tests, with a Bonferroni-corrected alpha of .0028, were used to determine whether students' and DCTs' ratings of a factor's importance differed. The two groups showed considerable similarity, with only one factor (available funding) rated differently by students (M = 3.15, SD = 1.25) as opposed to DCTs (M = 4.00, SD = 0.82), t(17.98) = -4.00, p = .001, d = 0.80. When rank ordered, DCTs and students placed 73.7% of the factors within two ranks of each other.

Ratings of Program Quality

Students rated their program as performing in the good range (mean quality rating of 4 or above) on 9 of the 18 items assessed. With the exception of available funding (M = 2.43, SD = 1.25), all others were rated as at least satisfactory (mean quality rating of at least 3).
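The student-DCT comparison above pairs an unequal-variance (Welch) t test, which is what produces the fractional degrees of freedom of 17.98, with a Bonferroni-corrected alpha of .05/18 ≈ .0028. A minimal sketch of both pieces, using invented sample data rather than the study's ratings:

```python
import math
from statistics import mean, stdev

def welch_t(x, y):
    """Welch's t statistic and Satterthwaite df for two independent
    samples with possibly unequal variances."""
    nx, ny = len(x), len(y)
    vx, vy = stdev(x) ** 2, stdev(y) ** 2   # sample variances
    se2 = vx / nx + vy / ny                 # squared SE of the mean difference
    t = (mean(x) - mean(y)) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df

# Bonferroni-corrected alpha for the 18 factor comparisons, as in the study
alpha = 0.05 / 18
print(round(alpha, 4))  # → 0.0028

# Invented importance ratings for one factor from two groups
students = [3, 4, 2, 3, 4, 3, 2, 4]
dcts = [4, 5, 4, 4, 5]
t, df = welch_t(students, dcts)
```

With equal sample sizes and variances, the Satterthwaite df reduces to the usual n1 + n2 - 2; unequal group sizes and variances, as with 394 students versus 17 DCTs, pull it toward the smaller group.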

Table 2
Mean Quality Rating of Each Factor and Its Correlation With the Importance Rating of the Same Factor (n = 394)

Item                                            M     SD    r     p
Balance between applied and academic emphasis   4.37  0.81  .25   <.001
Quality of program graduates (d)                4.35  0.76  .23   <.001
Program's usefulness                            4.34  0.78  .16   .001
Demands of program (a)                          4.28  0.67  .12   .020
Completion rate (c)                             4.28  0.72  .15   .002
Externship opportunities                        4.25  0.85  .36   <.001
Reputation                                      4.23  0.77  .27   <.001
Connection with psychology community            4.14  0.85  .18   <.001
Quality of instruction                          4.13  0.87  .18   <.001
Location                                        3.99  1.00  .29   <.001
Quality of faculty-student interaction          3.97  1.16  .24   <.001
Quality of new students                         3.85  0.92  .34   <.001
Stability of faculty (f)                        3.78  1.05  .13   .008
Career enhancement activities (b)               3.71  0.99  .24   <.001
Culture of program                              3.70  1.20  .35   <.001
Available resources                             3.49  1.11  .37   <.001
Career assistance (g)                           3.15  1.04  .33   <.001
Available funding (e)                           2.43  1.25  .20   <.001

Note: M = mean; SD = standard deviation. (a) n = 393. (b) n = 392. (c) n = 390. (d) n = 387. (e) n = 386. (f) n = 381. (g) n = 361.

Variables Associated With Quality Ratings

There was a significant positive correlation between a factor's importance rating and its quality rating. All but 2 of the 18 Pearson correlations, tested with a Bonferroni-corrected alpha of .0028, indicated a significant positive correlation between the students' importance and quality ratings. The two exceptions, demands of the program (r = .12, p = .020) and stability of faculty (r = .13, p = .008), showed trends in the same direction as the other factors (see Table 2).
There was a significant difference in the overall mean quality rating of programs when analyzed by institutional setting (department of psychology, university-based professional school, freestanding school), with freestanding programs (M = 3.63, SD = 0.52) rated lower than university-based professional schools (M = 3.92, SD = 0.56) and programs housed in departments of psychology (M = 4.00, SD = 0.50), F(2, 386) = 8.36, p < .001, η² = .042. However, a potential confound existed between type of program and year in school, as respondents from freestanding schools were significantly further along in the program (3.3 years) than those in university-based professional schools (2.6 years) and departments of psychology (2.7 years), F(2, 377) = 10.76, p = .001, η² = .094. To control for year in school, a forced-entry regression was conducted to gauge the unique variance that institutional setting contributed to overall mean quality of program. Institutional setting was dummy coded with freestanding programs as the reference category. Year of program significantly predicted students' overall mean quality rating, F(1, 378) = 32.16, p < .001, adjusted R² = .076. Institutional setting added to this prediction, adjusted R² = .106, ΔR² = .03, F(3, 376) = 15.93, p < .001. The contrast comparing those programs was a significant predictor, t(376) = 3.53, p < .001, B = .32, β = .28, 95% confidence interval [CI] [.14, .50].
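The forced-entry (hierarchical) regression described above can be sketched with ordinary least squares: fit year in program first, then add dummy-coded institutional setting and compare the change in R². The data below are invented for illustration only; the effect sizes and seed are assumptions, not the study's numbers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: quality rating predicted by year in program, then by
# institutional setting (dummy coded; freestanding = reference category).
n = 300
year = rng.integers(1, 6, size=n).astype(float)   # years 1-5
setting = rng.integers(0, 3, size=n)              # 0 = freestanding, 1 = prof. school, 2 = psych. dept.
quality = 3.0 + 0.15 * year + 0.3 * (setting > 0) + rng.normal(0, 0.5, n)

def r_squared(X, y):
    """R^2 of an OLS fit of y on X (X must include an intercept column)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

ones = np.ones(n)
step1 = np.column_stack([ones, year])                         # step 1: year only
dummies = np.column_stack([setting == 1, setting == 2]).astype(float)
step2 = np.column_stack([step1, dummies])                     # step 2: add setting

r2_1, r2_2 = r_squared(step1, quality), r_squared(step2, quality)
print(f"step 1 R^2 = {r2_1:.3f}, step 2 R^2 = {r2_2:.3f}, delta R^2 = {r2_2 - r2_1:.3f}")
```

Because the step 1 model is nested in the step 2 model, R² can only increase; the increment (ΔR²) is the share of variance uniquely attributable to institutional setting after year in program is controlled.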

Rankings

To obtain a ranking of the quality of each item per program, a mean value was calculated for each factor based on the quality ratings of all responding students in that program. These means were then rank-ordered, with the top five schools on each factor being reported (see Table 3). If a program does not appear on a list, it could mean that the program was not in the top five schools for that particular factor, that the program did not participate in the survey, or that fewer than 10% of that program's students responded, based on the program size listed in the APA's Graduate Study in Psychology (2006).

Discussion

Each year, thousands of potential future clinical PsyD students attempt to choose the best graduate program to attend. The best graduate program for a given student is likely the one that is the best fit (Fauber, 2006; McIlvried et al., 2010; Reynolds et al., 2008; Walfish et al., 1989). However, to determine goodness of fit, the prospective student needs a clear understanding of what the program is truly like, as well as an appreciation of which dimensions are important to consider when evaluating clinical PsyD programs. Obtaining accurate, meaningful information on what programs are like can be challenging. Global rankings or terse profiles often neglect to include the viewpoints of current students or other information the applicant considers meaningful. To address these concerns, applicants often access sources on the Internet to learn about a program, but they might question whether such information is biased (Burgess et al., 2008). Moreover, such sources are often less organized, making it difficult to sift through and integrate the data, particularly as prospective students might not even be sure what types of information are relevant to consider.

Students' Views

Students reported many factors (17 of 18) as moderately important in their own program selection process.
Visual inspection of students' ratings of importance suggested some loose groupings among the factors. Students rated eight factors as more than very important (range of mean values = 4.13-4.54). These eight factors appeared to reflect issues related to the program's structure (program's usefulness, balance between applied and academic emphasis, quality of instruction, and externship opportunities), the tone of the program (culture of program and quality of faculty-student interaction), and its reputation (reputation and quality of program graduates).

A program's structure defines what a student will gain from the program. It appeared that clinical PsyD students prioritize a program that provides a balanced educational environment, yet one that emphasizes applied aspects. They value high-quality instruction both within the classroom and in their clinical practicum. This finding is consistent with prior research (Kraiger & Abalos, 2004; McIlvried et al., 2010; Walfish et al., 1989) on graduate psychology program selection. Program tone represents the general atmosphere of the setting. Also consistent with prior findings, students appeared to desire high-quality, supportive interactions with the faculty in an environment that was flexible and not overwhelming (Kraiger & Abalos, 2004; Walfish et al., 1989). The final element was program reputation, or how the training is viewed by the larger psychological community, including the extent to which completion of the program facilitates the student's successful attainment of future professional goals. This variable had not been identified as a highly important item in prior studies.

Less important to this sample of PsyD students were two factors likely more salient to PhD students (career enhancement activities and stability within the faculty) and two factors that students rated as low in quality at their current PsyD programs (career assistance and available funding).
With respect to career enhancement activities and stability within the faculty, these findings suggest that although research mentorship is central to the PhD model, it is less applicable for PsyD students, indicating the need to differentiate PhD and PsyD students' needs.
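The ranking procedure described under Rankings (per-program mean quality ratings, a 10% response-rate inclusion threshold, and reporting of the top five programs per factor) can be sketched as follows. This is a minimal illustration only: the program names, ratings, and enrollment figures are invented, and the snippet is not the study's actual analysis code.

```python
# Minimal sketch of the ranking procedure described above.
# All data below are invented for illustration.

from statistics import mean

# (program, factor) -> list of student quality ratings on a 1-5 scale
ratings = {
    ("Program A", "Quality of instruction"): [5, 4, 5, 4],
    ("Program B", "Quality of instruction"): [4, 4, 3],
    ("Program C", "Quality of instruction"): [5, 5],
}

# Program size, as one might take from a directory such as the APA's
# Graduate Study in Psychology (2006); values here are invented.
program_size = {"Program A": 30, "Program B": 25, "Program C": 40}

def top_programs(factor, k=5, min_response_rate=0.10):
    """Rank programs on one factor by mean student quality rating.

    A program is included only if at least 10% of its enrolled students
    responded, mirroring the inclusion rule stated in the text.
    """
    results = []
    for (program, f), scores in ratings.items():
        if f != factor:
            continue
        if len(scores) / program_size[program] < min_response_rate:
            continue  # below the 10% response-rate threshold
        results.append((program, mean(scores), len(scores)))
    # Rank-order by mean rating, highest first, and keep the top k
    results.sort(key=lambda t: t[1], reverse=True)
    return results[:k]

print(top_programs("Quality of instruction"))
```

With these invented numbers, Program C is excluded because only 2 of its 40 students (5%) responded, so the reported ranking contains Program A (M = 4.50) followed by Program B.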

Journal of Clinical Psychology, March 2012

Table 3
Ranking of the Top Five Graduate Programs on Each Factor Based on Students' Ratings of Program Quality

Program's usefulness
1. Marshall University: M = 4.75, SD = 0.50, n = 4 (16.0%)
2. Loyola College: M = 4.67, SD = 0.48, n = 30 (43.5%)
3. Rutgers University: M = 4.46, SD = 0.69, n = 28 (24.8%)
4. University of Denver: M = 4.45, SD = 0.63, n = 29 (20.1%)
5. George Fox University (tie): M = 4.40, SD = 0.87, n = 25 (28.1%)
5. Roosevelt University (tie): M = 4.40, SD = 0.70, n = 10 (18.2%)

Quality of instruction
1. Marshall University: M = 4.50, SD = 1.00, n = 4 (16.0%)
2. Long Island University C.W. Post Campus: M = 4.46, SD = 0.78, n = 13 (15.1%)
3. Loyola College: M = 4.40, SD = 0.68, n = 30 (43.5%)
4. Rutgers University: M = 4.36, SD = 0.62, n = 28 (24.8%)
5. Chestnut Hill College (tie): M = 4.35, SD = 0.75, n = 20 (24.1%)
5. Spalding University (tie): M = 4.35, SD = 0.63, n = 26 (20.0%)

Quality of faculty-student interaction
1. Marshall University: M = 4.75, SD = 0.50, n = 4 (16.0%)
2. Loyola College: M = 4.63, SD = 0.89, n = 30 (43.5%)
3. Roosevelt University: M = 4.50, SD = 0.71, n = 10 (18.2%)
4. George Fox University: M = 4.48, SD = 0.92, n = 25 (28.1%)
5. Indiana State University: M = 4.44, SD = 0.73, n = 9 (18.0%)

Balance between applied and academic emphasis
1. Loyola College: M = 4.67, SD = 0.61, n = 30 (43.5%)
2. Indiana State University: M = 4.56, SD = 0.53, n = 9 (18.0%)
3. Azusa Pacific University: M = 4.53, SD = 0.63, n = 15 (24.2%)
4. George Fox University: M = 4.52, SD = 0.65, n = 25 (28.1%)
5. Marshall University (tie): M = 4.50, SD = 1.00, n = 4 (16.0%)
5. Rutgers University (tie): M = 4.50, SD = 0.64, n = 28 (24.8%)

Culture of program
1. Marshall University: M = 5.00, SD = 0.00, n = 4 (16.0%)
2. Indiana State University: M = 4.56, SD = 0.73, n = 9 (18.0%)
3. George Fox University: M = 4.40, SD = 0.58, n = 25 (28.1%)
4. Loyola College: M = 4.13, SD = 1.04, n = 30 (43.5%)
5. Rutgers University: M = 4.07, SD = 0.77, n = 28 (24.8%)

Quality of program graduates
1. Indiana State University: M = 5.00, SD = 0.00, n = 9 (18.0%)
2. Loyola College: M = 4.89, SD = 0.32, n = 28 (40.1%)
3. Baylor University: M = 4.83, SD = 0.30, n = 12 (40.0%)
4. Rutgers University: M = 4.75, SD = 0.44, n = 28 (24.8%)
5. Long Island University C.W. Post Campus: M = 4.62, SD = 0.51, n = 13 (15.1%)

Externship opportunities
1. Rutgers University: M = 4.71, SD = 0.46, n = 28 (24.8%)
2. Loyola College: M = 4.67, SD = 0.48, n = 30 (43.5%)
3. Long Island University C.W. Post Campus: M = 4.54, SD = 0.78, n = 13 (15.1%)
4. Azusa Pacific University: M = 4.53, SD = 0.64, n = 15 (24.2%)
5. Baylor University (tie): M = 4.50, SD = 0.67, n = 12 (40.0%)
5. Yeshiva University (tie): M = 4.50, SD = 0.71, n = 26 (19.1%)

Reputation
1. Rutgers University: M = 4.86, SD = 0.37, n = 28 (24.8%)
2. Loyola College: M = 4.70, SD = 0.47, n = 30 (43.5%)
3. Baylor University: M = 4.67, SD = 0.49, n = 12 (40.0%)
4. University of Denver: M = 4.45, SD = 0.57, n = 29 (20.1%)
5. Indiana State University: M = 4.44, SD = 0.53, n = 9 (18.0%)

Table 3 (Continued)

Completion rate
1. Baylor University (tie): M = 4.67, SD = 0.49, n = 12 (40.0%)
1. Indiana State University (tie): M = 4.67, SD = 0.50, n = 9 (18.0%)
1. Loyola College (tie): M = 4.67, SD = 0.48, n = 30 (43.5%)
4. George Fox University: M = 4.56, SD = 0.51, n = 25 (28.1%)
5. University of Denver: M = 4.55, SD = 0.63, n = 29 (20.1%)

Demands of program
1. Loyola College: M = 4.67, SD = 0.48, n = 30 (43.5%)
2. Baylor University (tie): M = 4.50, SD = 0.52, n = 12 (40.0%)
2. Marshall University (tie): M = 4.50, SD = 0.58, n = 4 (16.0%)
2. Roosevelt University (tie): M = 4.50, SD = 0.71, n = 10 (18.2%)
5. Azusa Pacific University: M = 4.47, SD = 0.52, n = 15 (24.2%)

Connection with the psychology community
1. Rutgers University: M = 4.61, SD = 0.63, n = 28 (24.8%)
2. Loyola College: M = 4.47, SD = 0.68, n = 30 (43.5%)
3. George Fox University: M = 4.44, SD = 0.71, n = 25 (28.1%)
4. Azusa Pacific University: M = 4.33, SD = 0.82, n = 15 (24.2%)
5. Philadelphia College of Osteopathic Medicine: M = 4.32, SD = 1.00, n = 19 (14.6%)

Available resources
1. Indiana State University: M = 4.56, SD = 0.53, n = 9 (18.0%)
2. Baylor University: M = 4.25, SD = 0.74, n = 12 (40.0%)
3. Loyola College: M = 4.23, SD = 0.90, n = 30 (43.5%)
4. Rutgers University: M = 4.21, SD = 0.63, n = 28 (24.8%)
5. Philadelphia College of Osteopathic Medicine: M = 4.05, SD = 1.27, n = 19 (14.6%)

Location
1. Pacific University: M = 4.63, SD = 0.79, n = 38 (18.2%)
2. University of Denver: M = 4.55, SD = 0.63, n = 29 (20.1%)
3. University of Indianapolis: M = 4.48, SD = 0.65, n = 25 (26.9%)
4. Roosevelt University: M = 4.30, SD = 0.95, n = 10 (18.2%)
5. Marshall University: M = 4.25, SD = 0.98, n = 4 (16.0%)

Quality of new students
1. Rutgers University: M = 4.75, SD = 0.59, n = 28 (24.8%)
2. Indiana State University: M = 4.56, SD = 0.53, n = 9 (18.0%)
3. Baylor University: M = 4.50, SD = 0.67, n = 12 (40.0%)
4. Roosevelt University: M = 4.40, SD = 0.52, n = 10 (18.2%)
5. Azusa Pacific University (tie): M = 4.13, SD = 0.74, n = 15 (24.2%)
5. Loyola College (tie): M = 4.13, SD = 0.73, n = 30 (43.5%)

Career enhancement activities
1. Azusa Pacific University: M = 4.43, SD = 0.85, n = 14 (22.6%)
2. George Fox University: M = 4.40, SD = 0.82, n = 25 (28.1%)
3. Philadelphia College of Osteopathic Medicine: M = 4.37, SD = 1.01, n = 19 (14.6%)
4. Long Island University C.W. Post Campus: M = 4.00, SD = 0.71, n = 13 (15.1%)
5. Loyola College: M = 3.97, SD = 0.77, n = 30 (43.5%)

Stability of faculty
1. Roosevelt University: M = 4.50, SD = 0.71, n = 10 (18.2%)
2. George Fox University: M = 4.40, SD = 0.71, n = 25 (28.1%)
3. Loyola College: M = 4.24, SD = 0.79, n = 29 (42.0%)
4. Rutgers University: M = 4.15, SD = 0.77, n = 27 (23.9%)
5. Indiana State University: M = 4.11, SD = 0.78, n = 9 (18.0%)

Available funding
1. Indiana State University: M = 4.56, SD = 0.53, n = 9 (18.0%)
2. Baylor University: M = 4.17, SD = 1.03, n = 12 (40.0%)
3. Long Island University C.W. Post Campus: M = 3.54, SD = 1.33, n = 13 (15.1%)
4. Marshall University: M = 3.25, SD = 1.71, n = 4 (16.0%)
5. Azusa Pacific University: M = 3.07, SD = 1.22, n = 15 (24.2%)

Table 3 (Continued)

Career assistance
1. Indiana State University: M = 3.78, SD = 0.67, n = 9 (18.0%)
2. Rutgers University: M = 3.70, SD = 0.83, n = 27 (23.9%)
3. Long Island University C.W. Post Campus: M = 3.46, SD = 0.88, n = 13 (15.1%)
4. Loyola College: M = 3.38, SD = 0.85, n = 26 (37.7%)
5. George Fox University: M = 3.35, SD = 0.83, n = 23 (25.8%)

Note: M = mean; SD = standard deviation.

The low importance the surveyed PsyD students assigned to available funding might simply have reflected either resignation to the financial reality of pursuing PsyD training or an incomplete appreciation of the implications of beginning a career with high levels of educational debt. Less than 3% of PsyD students use university sources as their primary means of financial support, whereas 57% use loans and 35% use their own resources (Wicherski & Kohout, 2007). Additionally, in 2005, PsyD graduates reported an average of $100,000 in debt. The relatively low rating career assistance received might reflect the preponderance of responses from students in the initial years of their graduate school experience, who might not yet be focused on postgraduation employment issues.

DCTs' Views

The importance ratings provided by the university-affiliated DCTs who responded were similar to those of the students, differing significantly on only one factor (available funding). DCTs' greater emphasis on this factor might have represented a greater awareness of the debt challenge awaiting their PsyD graduates, as well as an appreciation of how funding affects broader program issues (e.g., recruitment, retention, student satisfaction).

Perceptions of Quality

Although previous ranking guides have been published, they have been limited by their focus on providing a single global ranking of quality.
This study showed that clinical PsyD students believe there are many important aspects to consider when choosing a graduate school; thus, a system that emphasizes a program's pattern of strengths and weaknesses is likely to better suit the needs of prospective students. The majority of students, primarily from academically housed PsyD programs, reported being satisfied with the quality of most aspects of their individual schools, with 9 of the 18 factors achieving a rating of 4 or above. Of note, with the exception of two factors (demands of the program and career assistance), there was a significant correlation between the importance that students placed on an item and their quality ratings of that same factor at their own graduate program. This finding has a few possible interpretations. First, importance and quality ratings could be highly correlated because students have been able to seek out the programs that offer the attributes they deem important. However, given some of the limitations discussed above regarding information sources, as well as the fact that additional factors influence program selection (e.g., a student's academic record, GRE scores, number of applicants), it is unlikely that this explanation alone can account for the robust association between quality and importance. Another explanation involves the notion of cognitive dissonance. Given that students have invested considerable time, effort, and money in their graduate educations, they might be motivated to view their programs favorably on the variables they have identified as important in order to justify their expenditures. Or, conversely, they might be motivated to rank the items in which their program excels as most important, thereby confirming that their program is

a high-quality program. Either way, the student would avoid feeling discomfort about his or her decision to attend a particular graduate program. One question that will need further investigation in future research arose from a regression analysis indicating that institutional setting (specifically, programs housed in departments of psychology compared with freestanding ones) was a significant predictor of overall mean quality rating when controlling for year in program. However, this finding must be considered preliminary and interpreted with caution given that so few (n = 3) freestanding PsyD programs had students who participated in the study. Much has been written regarding the potential effect on quality of the higher acceptance rates and larger class sizes in freestanding as opposed to university-affiliated programs (Norcross et al., 2004; Norcross et al., 2010; Templer, Stroup, Mancuso, & Tangen, 2008). Although the findings of this study are not inconsistent with such notions, the very limited sample of participating freestanding programs precludes any definite conclusions based on data from this study.

Limitations of the Study

As described above, a limitation of the current study was the representativeness of the sample. Because all of the survey invitations were sent through each program's DCT, the number of students who received an invitation to participate was unknown. Overall, responses were received from students in 36.2% of the programs solicited; McIlvried et al.'s (2010) survey, conducted under the auspices of the APA Research Office, had a 41% response rate of programs. Nevertheless, participation varied by program type, with minimal participation among freestanding PsyD programs. Given that recent literature assessing the quality of PsyD programs at times fails to differentiate students in university-affiliated PsyD programs from those in freestanding PsyD programs (Baker et al.,
2009; Cassin, Singer, Dobson, & Altmaier, 2007; Perry & Boccaccini, 2009; Rosen & Oakland, 2008), despite evidence that these groups differ on a number of important variables (Norcross et al., 2010), the present study's emphasis might in fact prove helpful in shedding light on the less populated university-affiliated PsyD programs. This study sought the opinions of currently enrolled PsyD students and, to a more limited extent, DCTs. It was thought that because current students had recently completed the admissions process, they would be able to evoke their mindset as applicants while also being able to provide quality ratings of their program. However, opinions regarding the value of different aspects might change as an individual transitions from applicant to PsyD student to alumnus. Another limitation arose from the list of items that formed the basis for the importance and quality ratings. Any bias in item selection was minimized by choosing all but one of the items from prior research (Kraiger & Abalos, 2004; McIlvried et al., 2010; Walfish et al., 1989). Nevertheless, the list was not exhaustive, as illustrated by the 13.8% of respondents who wrote in additional "other" items to be considered. Finally, the quality ranking system relied solely on the subjective perceptions of one type of informant. Given the concerns raised about the role cognitive dissonance might play in student perceptions, a system based on a mix of objective and subjective data from multiple informants would be optimal.

Implications

Despite this study's limitations, it is unique in the relative recency of its data and in its focus on clinical PsyD students' views regarding program selection. Most of the literature on clinical program selection is limited and dated, impeding the extent to which it can be used by either prospective students or PsyD programs to understand the admissions process. The current study identifies factors for applicants to consider when choosing a graduate program.
Once prospective students recognize the attributes that are important to them, they can use the individual quality rankings to home in on the programs that clearly match their graduate school goals. Programs, too, can benefit from the findings. A heightened awareness of the aspects that prospective students value can inform recruitment, curricular, and resource decisions.

Quality rankings can be one element in self-evaluation, assisting in program improvement and refinement. Last, with clearer communication regarding programs' strengths, improved matches are likely to occur, leading to better retention and successful program completion. Finally, further studies could include objective measures in combination with subjective assessments of the type gathered in this research. Data such as GRE scores, internship match rates, and Examination for Professional Practice in Psychology (EPPP) pass rates could be used in conjunction with students' responses to further gauge program quality. Much of the objective data is currently available in other locations; combining the information into one location would assist future applicants and administrators in gathering all the important data.

References

American Psychological Association. (2006). Graduate study in psychology 2006. Washington, DC: Author.
American Psychological Association. (2011). Accredited programs in clinical psychology. Retrieved from http://www.apa.org/ed/accreditation/programs/clinical.aspx
Baker, T.B., McFall, R.M., & Shoham, V. (2009). Current status and future prospects of clinical psychology: Towards a scientifically principled approach to mental and behavioral health care. Psychological Science in the Public Interest, 9, 67-103. doi:10.1111/j.1539-6053.2009.01036.x
Burgess, D., Keeley, J., & Blashfield, R. (2008). Full disclosure data on clinical psychology doctorate programs. Training and Education in Professional Psychology, 2, 117-122. doi:10.1037/1931-3918.2.2.117
Cassin, S.E., Singer, A.R., Dobson, K.S., & Altmaier, E.M. (2007). Professional interests and career aspirations of graduate students in professional psychology: An exploratory survey. Training and Education in Professional Psychology, 1, 26-37. doi:10.1037/1931-3918.1.1.26
Fauber, R.L. (2006).
Graduate admissions in clinical psychology: Observations on the present and thoughts on the future. Clinical Psychology: Science and Practice, 13, 227-234. doi:10.1111/j.1468-2850.2006.00029.x
Galotti, K.M. (2001). Helps and hindrances for adolescents making important real-life decisions. Applied Developmental Psychology, 22(3), 275-287. doi:10.1016/s0193-3973(01)00084-3
Graham, J.M., & Kim, Y.H. (2011). Predictors of doctoral student success in professional psychology: Characteristics of students, programs, and universities. Journal of Clinical Psychology, 67, 340-354. doi:10.1002/jclp.20767
Griffith, A., & Rask, K. (2007). The influence of US News and World Report collegiate rankings on the matriculation decision of high-ability students: 1995-2004. Economics of Education Review, 26, 244-255. doi:10.1016/j.econedurev.2005.11.002
Hunter, G.A., Delgado-Romero, E.A., & Stewart, A.E. (2009). What's on your training program's web site? Observations and recommendations for effective recruitment. Training and Education in Professional Psychology, 3, 53-61. doi:10.1037/a0013825
Kohout, J., & Wicherski, M. (2010). 2010 Graduate study in psychology snapshot: Faculty in U.S. and Canadian graduate departments of psychology: 2008-2009. Retrieved from http://www.apa.org/workforce/publications/10-grad-study/report-faculty.pdf
Kraiger, K., & Abalos, A. (2004). Rankings of graduate programs in I/O psychology based on student ratings of quality. The Industrial Organizational Psychologist, 42, 28-43. Retrieved from http://www.siop.org/tip/backissues/july04/pdf/421_028to043.pdf
Landrum, R.E. (2010). Intent to apply to graduate school: Perceptions of senior year psychology majors. North American Journal of Psychology, 12, 243-254.
McIlvried, E.J., Wall, J.R., Kohout, J., Keys, S., & Goreczny, A. (2010). Graduate training in clinical psychology: Student perspectives on selecting a program. Training and Education in Professional Psychology, 4, 105-115.
doi:10.1037/a0016155
Meredith, M. (2004). Why do universities compete in the ratings game? An empirical analysis of the effects of the U.S. News and World Report college rankings. Research in Higher Education, 45, 443-461. doi:0361-0365/04/0800-0443/0
Monks, J., & Ehrenberg, R.G. (1999). U.S. News and World Report's college rankings. Change, 31(6), 43-51.

Norcross, J.C., & Castle, P.H. (2002). Appreciating the PsyD: The facts. Eye on Psi Chi, 7, 22-26. Retrieved from http://www.psichi.org/pubs/articles/article_171.aspx
Norcross, J.C., Castle, P.H., Sayette, M.A., & Mayne, T.J. (2004). The PsyD: Heterogeneity in practitioner training. Professional Psychology: Research and Practice, 35, 412-419. doi:10.1037/0735-7028.35.4.412
Norcross, J.C., Ellis, J.L., & Sayette, M.A. (2010). Getting in and getting money: A comparative analysis of admission standards, acceptance rates, and financial assistance across the research-practice continuum in clinical psychology programs. Training and Education in Professional Psychology, 4, 99-104. doi:10.1037/a0014880
Norcross, J.C., Kohout, J.L., & Wicherski, M. (2005). Graduate study in psychology: 1971-2004. American Psychologist, 60, 959-975. doi:10.1037/0003-066x.60.9.959
Perry, K.M., & Boccaccini, M.T. (2009). Specialized training in APA-accredited clinical psychology doctoral programs: Findings from a review of program websites. Clinical Psychology: Science and Practice, 16, 348-359. doi:10.1111/j.1468-2850.2009.01173.x
Reynolds, E.K., Sargeant, M.N., Rooney, M.E., Tashiro, T., & Lejuez, C.W. (2008). Actively assessing fit when applying to PhD programs in clinical/counseling psychology: The applicant's perspective. The Behavior Therapist, 31, 57-61. Retrieved from http://usc.academia.edu/marshasargeant/Papers/97436/Actively_Assessing_Fit_When_Applying_to_Ph.D._Programs_in_Clinical_Counseling_Psychology_The_Applicants_Perspective
Rosen, E., & Oakland, T. (2008). Graduate preparation in research methods: The current status of APA-accredited professional programs in psychology. Training and Education in Professional Psychology, 2, 42-49. doi:10.1037/1931-3918.2.1.42
Suls, J., Martin, R., & Wheeler, L. (2000). Three kinds of opinion comparison: The triadic model. Personality & Social Psychology Review, 4(3), 219-237.
Templer, D.I., Stroup, K., Mancuso, L.J., & Tangen, K. (2008). Comparative decline of professional school graduates' performance on the Examination for Professional Practice in Psychology. Psychological Reports, 102, 551-560. doi:10.2466/pro.102.2.551-560
U.S. News and World Report. (2008). Social sciences & humanities methodology. Retrieved from http://www.usnews.com/education/articles/2008/03/26/social-sciences-humanities-methodology
Walfish, S., Stenmark, D.E., Shealy, J.S., & Shealy, S.E. (1989). Reasons why applicants select clinical psychology graduate programs. Professional Psychology: Research and Practice, 20, 350-354. doi:10.1037/0735-7028.20.5.350
Wicherski, M., & Kohout, J. (2007). 2005 doctorate employment survey. Retrieved from http://www.apa.org/workforce/publications/05-doc-empl/index.aspx