1. Quality of program outcomes

What direct measures of student learning/achievement does the program use? CLA; program assessment (portfolio, common final/question set, licensure exams, exit exam, etc.)

What indirect measures of student and faculty satisfaction does the program use? IDEA aggregate data on shared program objectives or Group Summary Reports; NSSE scores; program survey results (if available)

Required for Review: Ask and answer an assessment-focused question

The psychology program has been attentive to the assessment of student achievement and diligent in using the results. We have used a mixed-method approach to collecting data, employing both direct and indirect measures to investigate the achievements of our majors. We have focused our attention on important learning outcomes recommended by the APA in its 2007 guidelines. Our direct measures were chosen to address learning outcomes under Goals 2 and 3 of the APA guidelines, Research Methods in Psychology and Critical Thinking in Psychology. One measure of Information and Technological Literacy (Goal 6) was administered in 2006 and is due to be repeated in 2011. In addition to the interpretive analysis of course-based assessment, we have used the following instruments.

The Collegiate Learning Assessment (CLA), a national standardized measure of value-added learning, assesses student achievement in writing, critical thinking, and analytic reasoning. Thirty-seven psychology majors took the CLA in the 2007-2008 academic year. The mean CLA scores for psychology seniors at Stockton were 1113 (SD = 205) on the performance task and 1091 (SD = 143) on the analytic writing task. For analytic writing, four students were at the expected level of achievement, four were above/well above, and seven were below/well below. For the make-an-argument task, two students were at the expected level, seven were above/well above, and six were below/well below.
Two students were at the expected level of performance in the critique-an-argument task, four were well above, and nine were below/well below.

In the 2009-2010 academic year, 24 psychology seniors took the CLA. The average GPA for these students was 3.17, with a range from 2.14 to 4.0. Nationally, the mean CLA scores of seniors across schools that took the CLA were 1156 on the performance task and 1226 on analytic writing (2010 data). The mean scores of Stockton seniors were 1130 on the performance task and 1166 on analytic writing; psychology seniors at Stockton averaged 1090 and 1194, respectively. The samples were small in both cases, but the results were very similar. On both the performance and analytic writing tasks, psychology seniors, like all Stockton seniors, performed below the national mean. On the writing task, the psychology seniors who were tested had means slightly higher than Stockton seniors overall; on the performance task, slightly lower. However, both the college's data and the program's data have such high variance (standard deviations of 176.3 for the performance task and 126.6 for analytic writing) that this difference is not particularly meaningful. In both administrations of the CLA, fewer than half of the seniors were native Stockton students. On average, the performance of students in the sample was just below the expected level for their GPA and entering qualifications. In response to these initial unsatisfactory results, psychology faculty participated in the college's spring assessment institute and developed a new performance task for use in instruction and
measurement. This new task, the Drunk Driving Reduction Task, has been made available to college faculty nationwide on the CLA in the Classroom website, and psychology faculty have used it for instruction and for formative assessment of our majors.

Additionally, 90 students completed the retired CLA Crime Reduction performance task. This authentic assessment mirrors the skills and abilities measured by the CLA, and we used it to assess both student achievement and student learning in critical thinking. It is a 90-minute interpretive exercise in which students evaluate seven different documents, draw appropriate conclusions, suggest alternative explanations for relationships between variables, and show some organization in their writing. The standardized scoring rubric for the task was developed by the CLA. Central to a correct analysis is students' clarity about the difference between correlation and causation. All of the students who were tested had completed Statistical Methods, the course in which correlation analysis is covered.

Forty-one students were tested at the start and at the end of the semester in which they were enrolled in Experimental Psychology. Students performed better at the end of the semester than at the start, t(39) = -3.902, p < .001. The mean scores on the pre- and post-tests were 2.942 and 3.600, out of a maximum score of 5.000. The post-test scores were significantly higher than the pre-test scores, and students clearly learned important concepts and skills in the interval after the pre-test. There was, however, much greater variability in the post-test distribution of scores than in the pre-test; instruction was not uniformly effective. An additional forty-nine students were tested in a single administration at the start of the semester in which they were enrolled in Experimental Psychology.
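The pre/post comparison above is a paired-samples t-test on each student's change in rubric score. As a sketch of how that statistic is computed (the score lists below are hypothetical; the report gives only the summary values), assuming a 0-5 rubric scale:

```python
# Paired t-test sketch for pre/post rubric scores (0-5 scale).
# The score lists are hypothetical; the report provides only the
# summary statistics (means 2.942 and 3.600, df = 39).
import math
import statistics

def paired_t(pre, post):
    """Return (t, df) for a paired-samples t-test."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample SD of the differences
    t = mean_d / (sd_d / math.sqrt(n))      # t = mean difference / standard error
    return t, n - 1

# Hypothetical data: most students improve between pre- and post-test.
pre  = [2.5, 3.0, 2.0, 3.5, 2.5, 3.0, 2.5, 4.0]
post = [3.5, 3.5, 3.0, 4.0, 2.5, 4.0, 3.0, 4.5]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.3f}")
```

A positive t here corresponds to the negative t reported above only because of the direction of subtraction; the magnitude and df are what the significance test uses.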
In this assessment of achievement, the students performed at a level similar to the pre-test levels in the earlier administration of the test. Even with the success of the instructional intervention, too many students were willing to take the evidence at face value instead of considering various points of view and alternative explanations, and too many were not adept at reading graphs or charts.

Non-Standardized Direct Measures

We have tested students' achievement in an important outcome for psychology majors: the ability to understand statistical data as reported in research articles. The instrument, Reading Statistics and Research, was developed by the program in 2003 and has been administered to more than 500 psychology majors over the past seven years. In the most recent administration, fall 2010, 129 students were tested. The pre/post design of this direct measure of the comprehension and application of statistical analyses and research design shows that students gain most from the combination of Statistical Methods and Experimental Psychology. Although there is the expected significant increase in percentage scores from pre- to post-test, the average post-instruction score of 59% is still 11 percentage points below a passing grade. There is wide variability in the post-test results, with scores ranging from 38% to 97%, and the results are far from satisfactory.

To assess students' knowledge of research design, Dr. Sluzenski created a task that asks students to identify major flaws in experimental design. Over 100 students completed the task online. There was no significant relationship between performance on the task and the number of psychology courses students had taken (r = .13, p > .10). These data suggest that we can improve as a
program in teaching the essentials of experimental design. This assessment will be repeated for reliability and will be modified, based on the results, for task difficulty.

Indirect Measures

Valuable information from indirect (self-report) measures of engagement, satisfaction, and progress was gleaned from the National Survey of Student Engagement (NSSE) and the IDEA. Compared to the general college population at Stockton, psychology majors report high levels of intellectual engagement, higher-level cognitive activities in class, rigorous coursework, and adequate co-curricular participation. Of the 250 majors who completed the NSSE, the vast majority report that they are engaged with the community, their peers, and the faculty who teach them. Twenty-five percent of the NSSE sample reported that they had completed an internship, and 46% had done community service or volunteer work. More than 80% of the majors who participated in the survey reported that the institution had contributed "quite a bit" or "very much" to their critical thinking, and 75% said the institution contributed "quite a bit" or "very much" to the development of their ability to solve quantitative analytic problems.

Thirty-six graduating psychology majors completed the exit survey, and they reported, for the most part, that they were very satisfied with the educational experience of studying psychology at Stockton. Ninety-four percent said they learned a great deal about research methods, and 78% rated their quantitative skills development most highly. Sixty-seven percent and 50% of majors said they learned a great deal about global issues and environmental issues, respectively, while 97% cited major gains in interpersonal skills.

Summary reports from the Individual Development and Educational Assessment (IDEA) student rating system indicate that students are highly satisfied with the psychology courses and that, compared to other courses at the institution, psychology courses are very highly rated by students.
As an indication of the progress we are making in addressing the ability of our students to interact effectively with people of diverse abilities, backgrounds, and cultural perspectives, we are reviewing their responses to one NSSE question: their rating of the institutional emphasis on encouraging contact among students from different backgrounds. Table 9 and Figures 3 and 4 summarize our findings.

Table 9: Psychology Majors' Self-Report of Intercultural Institutional Emphasis, 2005-2008
Institutional emphasis: Encouraging contact among students from different economic, social, and racial or ethnic backgrounds

                 Frequency   Percent   Valid Percent   Cumulative Percent
  Very little        40        15.9        16.0              16.0
  Some               75        29.9        30.0              46.0
  Quite a bit        84        33.5        33.6              79.6
  Very much          51        20.3        20.4             100.0
  Valid total       250        99.6       100.0
  Missing             1          .4
  Total             251       100.0
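The Percent, Valid Percent, and Cumulative Percent columns in Table 9 follow directly from the raw frequencies (Percent is taken over all 251 respondents, Valid Percent over the 250 who answered); a minimal sketch of that arithmetic:

```python
# Recomputing the percentage columns of the SPSS-style frequency
# table (Table 9) from the raw counts: 250 valid answers, 1 missing.
freqs = {"Very little": 40, "Some": 75, "Quite a bit": 84, "Very much": 51}
missing = 1
total = sum(freqs.values()) + missing     # 251 respondents
valid_total = sum(freqs.values())         # 250 valid answers

cumulative = 0.0
for label, f in freqs.items():
    percent = 100 * f / total             # share of all respondents
    valid = 100 * f / valid_total         # share of valid answers only
    cumulative += valid                   # running total of valid percent
    print(f"{label:12s} {f:4d} {percent:5.1f} {valid:5.1f} {cumulative:6.1f}")
```

Rounded to one decimal, these values reproduce the table rows (e.g. "Quite a bit": 33.5% of all respondents, 33.6% of valid answers, 79.6% cumulative).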

Figure 3: Intercultural Institutional Emphasis by Year
Figure 4: Intercultural Institutional Emphasis by Class

Just over half of our majors report adequate institutional emphasis on encouraging contact among students from different backgrounds. On balance, students who started their college careers at Stockton are not different from those who started elsewhere, and more seniors than freshmen report very little emphasis. In Figure 3, however, there appears to be a difference in students' perception of our institutional emphasis on cross-cultural interactions based on whether they started their college careers at Stockton. This may be related to a focus on intercultural competencies in the freshman-year experience that became more entrenched after 2005.

In our assessments we have answered several questions.

How well are our majors able to understand statistical reports published in psychology journals? Are they learning to do this in the Experimental Psychology class? We would like to see improvement in students' ability to understand published statistics, but students who have taken Experimental Psychology are better at it than those who have not yet done so.

How well developed are the analytical reasoning abilities of our majors? Are they learning to be better at this in the Experimental Psychology class? Students are definitely better at reasoning and critical thinking after they have taken Experimental Psychology than before, but we would like to see less variability in the post-test results and higher means across tasks.

Are students' abilities to analyze research designs related to the number of psychology courses they have taken? This is disappointing; students are not better at design analysis after several courses.

Are we emphasizing intercultural exposure enough? We should be more attentive to this outcome among upperclassmen.

IDEA Group Summary Reports for Psychology Classes, Fall 2008 - Spring 2010

Classes were grouped according to their course numbers: 2000-2999 were the lower-level classes (N = 34), 3000-3599 (N = 81) the upper-level classes, and 3600-3799 (N = 23) the seminars and tutorials.

Important or Essential Objectives

In all three groups of classes (lower, upper, and seminars), "gaining factual knowledge" was the most often selected objective (97%, 89%, and 83%, respectively) and "learning fundamental principles or theories" was second (82%, 85%, and 61%, respectively). Third for all three groups was "learning to apply course materials" (56%, 70%, and 61%, respectively). For the lower- and upper-level classes these percentages are higher than the institutional ones; these three objectives were selected as important or essential in psychology classes more often than in all classes at Stockton and in the IDEA database. The same was not true for seminars and tutorials, where psychology faculty selected the first objective, "gaining factual knowledge," more often than Stockton faculty in general and than the faculty in the IDEA database. The faculty in the seminar classes selected "learning fundamental principles" less frequently than did other Stockton faculty and were on par with the Stockton classes for "applying course materials." Sixty-one percent of the seminar classes, 37% of the upper-level classes, and 21% of the lower-level classes had "developing skill in expressing myself orally or in writing" as an essential or important objective.
The lower- and upper-level psychology classes were below the institutional values in "learning to analyze and critically evaluate ideas and points of view"; the seminars and tutorials were about equal to the institutional values.

Progress on Relevant Objectives (PRO)

The quality of instruction in lower-level, upper-level, and seminar classes in psychology is well above the average for all Stockton classes and above the national norms of colleges and universities using the IDEA system. In all three groups of classes, students' ratings of their progress on the relevant objectives were higher than the IDEA database average. The average PRO was highest for the seminar and tutorial group (4.5/5), where 100% of the classes were above the IDEA database average, and lowest for the lower-level classes (4.2/5), with 85% of the classes above the IDEA database average. Seventy-nine percent, 72%, and 90% of the lower-level, upper-level, and seminar/tutorial classes, respectively, were above the Stockton average for Progress on Relevant Objectives.

Excellent Teacher

For the lower- and upper-level classes, the excellent teacher score was equal to the IDEA database score, with 82% and 78% of the classes, respectively, above the database average and 82% and 74% above the Stockton average. The teachers of the seminar and tutorial classes were rated more highly than teachers in the other two groups; 90% of those classes had excellent teacher scores above both the IDEA and Stockton database averages.

Excellent Course

The Excellent Course ratings by psychology students for lower-level, upper-level, and seminar/tutorial classes were above the IDEA database averages in 85%, 78%, and 100% of classes, respectively, and above the Stockton averages in 74%, 60%, and 91%, respectively. This indicates that the instructional effectiveness of the psychology classes is very high and well above average.

Pedagogy

Psychology faculty in the seminar/tutorial classes were, without exception, using the most effective pedagogical strategies for the objectives they selected for their classes. In 100% of these classes, students reported that the appropriate pedagogy was used frequently. In the vast majority of upper-level classes, faculty used effective pedagogical strategies according to students' ratings. However, in a few classes (15% or less) students reported infrequent use of teaching behaviors that would have supported the objectives. Some students reported infrequent use of the following strategies: "displayed a personal interest in students and their learning," "explained course materials clearly and concisely," "inspired students to set and achieve goals which really challenged them," and "found ways to help students answer their own questions." In this group of classes there were some reports of infrequent use of 19 of the 20 teaching strategies. In all classes there was frequent encouragement of students to use multiple resources to improve understanding.
Students reported that the majority of lower-level classes used effective pedagogy for the objectives. In 100% of these classes there was use of groups or teams to facilitate learning. In approximately 70% of the classes, however, there was no encouragement of student-faculty interaction outside of class, and in 30% of the classes there was no display of personal interest in students and their learning, nor any encouragement to use multiple resources. In 30% of the classes students reported that they were not frequently given assignments requiring original or creative thinking, and in more than 20% of the classes students were not inspired to set and achieve goals that challenged them.

Faculty Emphases

In 80% of the upper-level classes and in 50% of the lower-level classes, faculty reported placing much emphasis on reading. In 8% of the upper-level and 60% of the lower-level classes, faculty reported placing much emphasis on critical thinking.

The psychology faculty are doing excellent work in instruction. Although there is a higher than usual concentration on knowledge and understanding objectives, there is adequate diversity in the objectives selected in the higher-level domains of learning. Students are more satisfied with the seminar and tutorial courses than with the upper- and lower-level classes. This is probably a predictable finding, given the differences among these types of classes; evaluators should be cognizant of this structural difference.