CRITICAL THINKING ASSESSMENT




CRITICAL THINKING ASSESSMENT REPORT
Prepared by Byron Javier, Assistant Dean of Research and Planning

Critical Thinking Assessment at MXC

As part of its assessment plan, the Assessment Committee decided to compare students in developmental education with students in capstone courses. The original critical thinking assessment did not focus on these student populations. Enrollment trends at the College indicate that approximately 65% of entering students test into developmental reading courses, 91% into developmental writing courses, and 96% into developmental math courses.

To gather baseline data on developmental students, the committee decided to use the CAAP test. The objective was to determine whether our general education core curriculum contributes to an increase in the skills and knowledge that our students are expected to acquire in the area of critical thinking. CAAP was chosen because it is a standardized, nationally normed assessment program from ACT that enables postsecondary institutions to assess, evaluate, and enhance the outcomes of their general education programs. The Critical Thinking test is a 32-item test that measures students' skills in analyzing, evaluating, and extending arguments. It is composed of three content categories: Analysis of Arguments, Evaluation of Arguments, and Extension of Arguments.

To implement this initiative, the committee used two groups of students:
- Incoming students enrolled in lower-level courses, usually developmental courses, were tested during fall 2010.
- Students in capstone courses in various disciplines and departments were then tested with the same instrument. This phase of the Student Learning Project focused on students who had completed the majority of their course work at Malcolm X College (spring 2011).

Fall 2010 Administration

The track 2 project during fall 2010 included 12 sections. Seven sections were from mathematics (Math 098 with 3 sections and Math 099 with 4 sections), and the English department contributed 5 sections (ENG 098 with 2 sections and ENG 100 with 3 sections).
A total of 272 students participated in the study. This number met the sample size needed to achieve the 95% confidence level at the selected margin of error. The results indicated a mean of 55.07 with a standard deviation of 3.937.
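The sample-size reasoning above can be sketched with the standard formula for estimating a proportion. This is an illustrative reconstruction only: the report does not state which margin of error was selected, so the 6% figure below is our assumption.

```python
import math

def required_sample_size(z: float, margin_of_error: float, p: float = 0.5) -> int:
    """Minimum n for estimating a proportion at a given confidence level.

    z: critical value (1.96 for a 95% confidence level).
    p: assumed proportion; p = 0.5 is the most conservative choice.
    """
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

# Assumed 6% margin of error at 95% confidence (not stated in the report).
n = required_sample_size(1.96, 0.06)  # -> 267
```

With these assumptions the required sample is 267 students; tightening the margin of error to 5% raises the requirement to 385.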

Spring 2011 Administration

The spring administration included 17 sections with a total of 350 students participating in the study. Several departments took part. From the general education division, Mathematics and English participated with 6 sections, totaling 173 students enrolled in Math 118 and ENG 102. Career programs included Respiratory Therapy (15 students), Nursing (80 students), Radiology (25 students), Child Development (10 students), Mortuary Science (15 students), Surgical Technology (10 students), Renal Technology (10 students), and Emergency Medical Technology (12 students). The mean score was 57.25 with a standard deviation of 4.857.

Demographic variables collected included ethnicity, gender, age, language, education level, enrollment status (full time, part time), educational plans, and GPA. We will further analyze these variables to create benchmarks.

Analysis

After reviewing the test results, we asked whether there was a statistical difference between the scores of students in their freshman year and those already in higher-level courses. When we compared the two groups, a difference was evident between the lower classes of the distribution and the higher classes. For example, fewer students in the higher-level courses scored in the second class than students in the entering (freshman) class, and more of them scored in the upper classes of the distribution.

Score class   Spring 2011   Fall 2010
45-49         3%            5%
50-54         30%           49%
55-59         39%           33%
60-64         20%           11%
65-69         6%            2%
70 and above  2%            0%
N             350           272

At the α = 0.05 level, the difference between the two groups was statistically significant, suggesting a clear difference between them (Spring 2011: n = 350, mean = 57.25).

We also compared students who had been at the institution 6 or more semesters (Spring 2011: n = 147, mean = 57.40) with freshmen (Fall 2010). At α = 0.05, this difference was also statistically significant, again suggesting a clear difference between the two groups.

Students with 4 or 5 semesters at MXC (Spring 2011: n = 51, mean = 57.57) were also compared with freshmen (Fall 2010). The difference was likewise statistically significant at α = 0.05, indicating a distinctive difference between the two groups.

Further analysis comparing students who had been at the institution for at least 3 semesters (Spring 2011: n = 103, mean = 56.55) with freshmen (Fall 2010) also indicated a statistically significant difference between the two groups.
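The report does not state which significance test was used. As one plausible reconstruction, a two-sample t statistic (Welch's form, which does not assume equal variances) can be computed directly from the summary statistics for the overall Spring 2011 vs. Fall 2010 comparison:

```python
import math

def welch_t(m1: float, s1: float, n1: int,
            m2: float, s2: float, n2: int) -> float:
    """Welch's t statistic from summary statistics (unequal variances)."""
    standard_error = math.sqrt(s1**2 / n1 + s2**2 / n2)
    return (m1 - m2) / standard_error

# Spring 2011 (n=350, mean=57.25, sd=4.857) vs. Fall 2010 (n=272, mean=55.07, sd=3.937)
t = welch_t(57.25, 4.857, 350, 55.07, 3.937, 272)
```

The resulting t is roughly 6.2, far beyond the approximate two-tailed critical value of 1.96 at α = 0.05 for samples this large, which is consistent with the report's finding of statistical significance.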

When we compared students with 2 semesters or less at MXC (Spring 2011: n = 49, mean = 56.22) with freshmen (Fall 2010), the findings also indicated a statistically significant difference.

The results were also compared with national results, which indicated that we are below the mean for two-year institutions. The national figures are based on CAAP-tested sophomores at two-year institutions.

Group      Mean
MXC        57.30
National   60.70

Reflecting on CAAP Critical Thinking Scores

                                                          Fall 2010   Spring 2011
National mean                                             60.7        60.7
MXC mean                                                  55.1        57.3
Difference (National minus MXC)                           5.6         3.4
National standard deviation                               5.4         5.4
MXC standard deviation                                    3.9         4.9
Difference (National minus MXC) in standard deviation units 1.5       0.5
National number of students                               26,264      26,264
MXC number of students                                    272         350

Overall, the results indicate that:
- MXC students' critical thinking skills differed from those of a national sample of students at other two-year public institutions.
- Students' critical thinking skills changed from the lower course levels to the higher course levels.
- There were meaningful differences in critical thinking skills between different groups of MXC students.
- Sample sizes for various student groups were very small, and national averages for different student groups were not available; therefore, data by student demographic characteristics should be interpreted very cautiously.

It is important to note some data limitations to be considered in the interpretation and use of results. For example, students were asked to report their level of effort using the following categories:
- Tried my best
- Gave moderate effort
- Gave little effort
- Gave no effort
- No response

The report included results of chance scores: the predicted scores that students would obtain if they responded to all test questions by guessing. Students who indicated that they tried their best but scored very low (i.e., below chance level) may in fact lack the skills or knowledge to perform adequately on the CAAP.

Graphical Representation

[Figure: Comparing National and MXC Results. Bar chart of mean scores for Fall 2010 and Spring 2011: national mean 60.7 in both terms; MXC mean 55.1 (Fall 2010) and 57.3 (Spring 2011).]

[Figure: Comparing Standard Deviations. Bar chart of standard deviations for Fall 2010 and Spring 2011: national 5.4 in both terms; MXC 3.9 (Fall 2010) and 4.9 (Spring 2011).]

Next Steps: Closing the Loop

For the analysis of the results and the implementation of new assessment initiatives, we need to remember that these studies are correlational in nature (as opposed to causal). In other words, there are correlations between scores on critical thinking instruments and hours completed at MXC. While this is a positive finding, we can only assert that both measures increase together; we cannot assert that one causes the other.

At the same time, as we review the results and note that some are statistically significant, keep in mind that with a large number of participants statistical significance becomes easier to find. Thus, we must also look at the effect size. For instance, a critical thinking score may be a significant predictor of grades in a Philosophy course, yet predict only 2% of the variance (effect size).

Recommendations for Maximizing the Usefulness of Results

The following list is not exhaustive; it represents only some general ideas. Each department or program can develop its own goals and objectives regarding critical thinking and its assessment.
- Examine the curriculum map (General Education and Careers) to determine exactly where the critical thinking student learning outcome is placed.
- Determine what characteristics critical thinking displays at MXC.
- Identify which courses introduce and reinforce critical thinking skills.
- Administer the CAAP in courses that enroll a large proportion of students who have completed critical thinking courses.
- Examine how test respondents who have taken critical thinking courses score during the new CAAP administration.
- Compare students who have completed the critical thinking curriculum with those who have not.
- Compare how students perform on each of the various content areas of the test.
- Determine the actions to be taken for curriculum development and improvement.
- Identify strengths and/or weaknesses in specific critical thinking skills (e.g., analysis of arguments, evaluation of arguments, and extension of arguments).
- For continuous improvement, establish test administrations that target specific level gains and develop an action plan to achieve them.
- Collect data from a larger sample of students (or collect data every other year with a larger sample) in order to have greater confidence in the representativeness of the results and to have sufficient data to permit analysis of different student groups.
- Identify the amount of value added that we expect in students' academic performance over a period of five to ten years.
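The effect-size caution above can be made concrete. A minimal sketch using the report's overall summary statistics follows; the choice of Cohen's d with a pooled standard deviation is ours for illustration, not a computation taken from the report.

```python
import math

def cohens_d(m1: float, s1: float, n1: int,
             m2: float, s2: float, n2: int) -> float:
    """Cohen's d standardized mean difference with a pooled standard deviation."""
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(pooled_var)

# Spring 2011 (n=350, mean=57.25, sd=4.857) vs. Fall 2010 (n=272, mean=55.07, sd=3.937)
d = cohens_d(57.25, 4.857, 350, 55.07, 3.937, 272)
```

The resulting d is roughly 0.49, conventionally a small-to-medium effect: a difference can be highly statistically significant with these sample sizes while still being modest in practical terms, which is exactly the distinction the section draws.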