SCHEV Core Competencies




The Governor's Blue Ribbon Commission on Higher Education (February 2000) recommended that the State Council of Higher Education for Virginia (SCHEV) coordinate the efforts to measure academic quality. As a result, assessment of the following six core competencies is required by the VCCS and SCHEV:

1. Written Competency
2. Information Literacy Competency
3. Quantitative Reasoning Competency
4. Scientific Reasoning Competency
5. Critical Thinking Competency
6. Oral Communication Competency

Definitions of the Competencies according to SCHEV

Written Competency
The faculty recognizes that students pursuing occupational/technical or transfer degrees in the Virginia Community College System must develop effective writing skills in order to succeed in the workplace as well as in future academic pursuits. Effective written discourse by these students must demonstrate that they possess the ability to develop and express complex ideas clearly, coherently, and logically in a style appropriate for both purpose and audience. This written discourse must also demonstrate that the students have mastered accepted standards of written communication. Therefore, competent writing will demonstrate mastery of focus, content, organization, style, and conventions.

Information Literacy Competency
Information literacy is a set of abilities requiring individuals to "recognize when information is needed and have the ability to locate, evaluate and use effectively the needed information."

Quantitative Reasoning Competency
A person who is quantitatively literate (i.e., competent in quantitative reasoning) possesses the skills, knowledge, and understanding necessary to apply numbers and mathematics to deal effectively with common problems and issues. A person who is quantitatively literate can use numerical, geometric, and measurement data and concepts, mathematical skills, and principles of mathematical reasoning to draw logical conclusions and to make well-reasoned decisions.

Scientific Reasoning Competency
Scientific reasoning is characterized by adherence to a self-correcting system of inquiry, the scientific method, and reliance on empirical evidence to describe, understand, predict, and control natural phenomena.

Critical Thinking Competency
Critical thinking is deciding what to believe and how to act after careful evaluation of the evidence and reasoning in a communication.

Oral Communication Competency
It is recognized that oral communication refers to a variety of forms of spoken discourse: public speaking, small group communication, interpersonal communication, and interviewing. For this purpose, oral communication is defined as the ability to deliver a spoken message of significant depth and complexity in a way that elicits an immediate response of understanding, appreciation, assent, or critical inquiry from an audience. Oral communication requires of the communicator a demonstrated grasp of the general purpose and specific occasion; the effective organization of material with emphasis on key ideas; the stylistic use of vivid and clear language as well as vocal and bodily expressiveness; and meaningful, appropriate, and sustained engagement with the audience.

The results of the competency assessments will be included in the Reports of Institutional Effectiveness (ROIE) and the institutional progress reports required for the System-wide Strategic Plan. The competency assessment reports are phased in, with two competencies reported each year:

Phase 1 (2003): Writing, Information Literacy
Phase 2 (2005): Quantitative Reasoning, Scientific Reasoning
Phase 3 (2007): Oral Communication, Critical Thinking

1. Writing Assessment

The Office of Assessment generated a random sample of 10% of NOVA's ENG 111 sections to participate in this assessment. The sample of 143 student papers represents approximately 5% of the 4,225 program-placed students enrolled in ENG 111 during the Fall 2002 semester.

VCCS Writing Assessment Workshop

Background: A pilot project to evaluate the writing skills of students enrolled in English 111 was completed in winter 2002. The pilot was based on the writing competency plan developed by the VCCS Task Force on Assessing Core Competencies. The plan called for holistic scoring of a 5% random sample of papers from program-placed students, written near the end of English 111 courses. The results of the pilot project demonstrated that standard writing prompts would improve the inter-rater reliability of the holistic scoring process. Three prompts that were successfully piloted in summer 2002 were distributed to faculty. In fall 2002, faculty were asked to choose prompts and to request 350-500 word papers, written outside of class, from students in English 111 classes. The randomly selected papers were due in the System Office in January 2003. Colleges were asked to make sure that any student-, faculty-, or college-specific identifiers had been removed before the papers were forwarded to the System Office.

Update: On March 28-29, 2003, one English faculty member from each college came to Richmond and participated in a writing assessment workshop designed to score the 5% sample of English 111 papers. Dr. Carole Bencich, a holistic scoring expert and professor of English at Indiana University of Pennsylvania, conducted the workshop. Using a scoring grid consisting of six categories, ranging from clear and consistent competence to incompetent, Dr. Bencich used papers from the VCCS sample to illustrate the scoring process.

Two faculty read and scored each of the 778 papers in the sample. For papers with more than a one-point discrepancy between scores, a third reader scored the paper, and the scores were averaged to produce the final score. Only 125 papers (16%) required third readers, producing an inter-rater reliability of 84%; a brief sketch of this adjudication rule appears at the end of this section.

The VCCS is required to report actual results on student writing competence to the State Council. These results will be published with those from the four-year institutions on the SCHEV website with the reports of institutional effectiveness. Prior to the submission to the State Council, the System Office will forward college-specific results to each college.

Action Required: This item is for information only. The committee is asked to discuss the format for the results and data that will be sent to each college. Additional data and information will be provided at the meeting.

Resource Person: Dr. Toni Cleveland, Vice Chancellor, Academic Services and Research

Results

        Total Papers   Mean Score   Median Score
VCCS    778            3.94         4.00
NOVA    *138           3.81         4.00

*This number differs from the number of papers NOVA submitted to the System Office for two reasons:

Our students' mean score (3.81) was slightly below the VCCS mean (3.94), which was itself a little below "adequate" (4.0). The VCCS was informed that our English faculty did not believe this was truly an assessment of our graduates' ability to write; it is an assessment of students completing ENG 111. Vice presidents from other colleges agreed. The System Office stated that it was looking into a different approach to assessing graduates' competencies.
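
The adjudication rule described earlier in this section (two readers per paper, a third reading when the first two scores differ by more than one point, and an averaged final score) can be sketched in a few lines of code. The snippet below is only a minimal illustration of that rule, not the workshop's actual software; the ratings are invented and the helper name is hypothetical.

    from statistics import mean, median

    def final_score(reader1, reader2, reader3=None):
        """Average two holistic ratings; when they differ by more than one point,
        a third rating is required and all three are averaged."""
        if abs(reader1 - reader2) > 1:
            if reader3 is None:
                raise ValueError("ratings differ by more than one point; a third reader is needed")
            return mean([reader1, reader2, reader3])
        return mean([reader1, reader2])

    # Invented ratings on the six-category holistic grid
    # (1 = incompetent, 6 = clear and consistent competence).
    papers = [
        (4, 4, None),  # readers agree
        (5, 4, None),  # within one point, no third reading
        (2, 5, 3),     # discrepancy of more than one point, third reader adjudicates
    ]

    finals = [final_score(*p) for p in papers]
    third_readings = sum(1 for r1, r2, _ in papers if abs(r1 - r2) > 1)
    agreement = 1 - third_readings / len(papers)  # the report's 84% figure is the share not needing a third reader

    print("final scores:", [round(s, 2) for s in finals])
    print(f"mean {mean(finals):.2f}, median {median(finals):.2f}")
    print(f"inter-rater agreement: {agreement:.0%}")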

2. Information Literacy Assessment

The Information Literacy Assessment Plan was developed by a task force whose members represented college staff, testing centers, distance learning specialists, and transfer and occupational/technical faculty. In addition, two members of the English faculty who could not attend the meeting submitted input.

The Plan

The NOVA plan is outlined according to the Information Literacy Guidelines 2002.

1. Identify where in the associate degree curriculum the information literacy competencies will fit, especially the application requirement.

Information literacy competencies are taught across the curriculum, in both general education and occupational/technical courses. With over 75 degree programs and specializations, there is no one place in which information literacy is formally taught. The task force feels that this is appropriate. Ideally, each program would have a capstone course in which students would think about how all their courses contributed to their education and in which assessments such as this could be given in order to continually improve the program. At this time, information literacy is integrated into the General Education goals and is partially covered in courses that students are encouraged to take early in their program, such as ENG 111-112 and STD 100.

2. Identify how the college plans to assess its students (i.e., lab, assessment day, class/course requirement).

To get the most meaningful cross-section of students, a random sample of 25% of the students applying for graduation in Spring 2003 who are also enrolled in classes for the Spring 2003 semester will be asked to take the test. Letters will be mailed to these students requesting their participation in the assessment. The letter will explain why they should take the test, refer to Testing Center hours of operation, and include a test pass to be presented to Testing Center staff. Any student with a current accommodation form on file with the NOVA Office of Students with Disabilities will be encouraged to contact Testing Centers prior to taking the assessment if he or she needs accommodations. Students who pass the test will receive a certificate stating that they passed the information literacy assessment test. Testing will occur mid-March through mid-April.

3. Estimate the number of students that will be tested.

Assuming that approximately 2,300 students will graduate in 2002-03, and assuming that approximately 70% of them will apply for graduation and be enrolled in the Spring 2003 semester, 1,610 students would be eligible to take the test. The 25% random sample would include 403 students (or 17.5% of 2002-03 graduates). Realistically, it is unlikely that more than half of these students will decide to take the test; even if only one quarter of them participate, NOVA would still test over 100 students. A rough sketch of this estimate appears below.
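
The arithmetic behind the sample estimate can be expressed directly. The snippet is illustrative only; the 2,300-graduate figure, the 70% eligibility rate, and the 25% sampling rate are the planning assumptions stated above, and the random-selection step merely stands in for however the Office of Assessment actually drew its sample.

    import math
    import random

    graduates = 2300                     # projected 2002-03 graduates (planning assumption)
    eligible = graduates * 70 // 100     # about 70% apply for graduation and enroll in Spring 2003
    sample_size = math.ceil(eligible * 25 / 100)   # 25% random sample invited to test

    print("eligible students:", eligible)                              # 1610
    print("students invited:", sample_size)                            # 403
    print(f"share of all graduates: {sample_size / graduates:.1%}")    # 17.5%
    print("expected testers if one quarter respond:", sample_size // 4)  # 100

    # Hypothetical selection step: the IDs below are placeholders, not real student data.
    eligible_ids = [f"S{n:05d}" for n in range(eligible)]
    invited = random.sample(eligible_ids, sample_size)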

4. Recommend an individual interested in exploring the idea of integrating the demonstration application with the critical thinking and oral communication competencies.

Members of NOVA's Assessment Committee would be interested in this project. We think the system-wide task force should include assessment personnel as well as both occupational/technical and transfer faculty. The members, their titles, and the reasons for their inclusion are listed below.

Dr. Max Bassett (VP, Academic & Student Services): Provide overview of feasibility of various options.
Jennifer Egan (Reference Librarian, LO): Experienced in assessment, knowledgeable about information literacy assessment in particular.
Mike Ghorbanian (Professor of Drafting, AL): Chairs the Drafting Faculty Cluster, represents occupational/technical faculty.
Gale Leonard (Testing Center Coordinator, AN): Runs the largest testing center, has followed core competency requirements, knows about basic skills assessment.
Dr. Charles Pumpuni (Associate Professor of Biology, AL): Chairs the Biology Faculty Cluster, represents transfer faculty.
Kevin Reed (Assistant Professor of IST, AL): Chairs the IST Faculty Cluster, represents occupational/technical faculty; several colleges plan to give the test in an IST course.
Dr. Sheri Robertson (Associate VP, Curriculum & Enrollment Services): Responsible for academic assessment, attended both regional meetings on core competency assessment.
Joan Trabandt (Instructional Technologist, ELI): Long-time member of the Assessment Committee, represents distance learning.
Maggie Zarnosky (Associate Director, Learning Resource Services, AL): Served on Gene Damon's system-wide task force on information literacy, coordinated the pilot of the test at NOVA, and knows a great deal about the test.

Results:

The VCCS implemented the test differently than JMU. JMU students could take the test at any time they wished; the VCCS used the test as a one-time assessment, and in most cases students were not given a choice unless there was a technical problem. Each VCCS college submitted a testing plan in November 2002 with the intent that students in all programs would be tested. The plans fell into three types: nine colleges chose to test graduates; six colleges chose to test students in English 112 and mandated the test as part of the course work; and eight colleges chose one or another of their courses that would involve students from all programs. The online test was made available to the colleges between January 15 and May 9, 2003. At the same time, we released a tutorial that students could use to prepare for the test.

        Number   Average   Median   % Met or Exceeded   % Did Not Meet   % Highly    % Met
        Tested   Score     Score    Standard            Standard         Competent   Standard
VCCS    3,678    36.4      37       53.2%               46.8%            28.7%       24.4%
NOVA    52       41.7      42       75.0%               25.0%            53.9%       21.2%
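
The summary rows above reduce to a handful of descriptive statistics over the individual test scores. The sketch below shows one way such a row could be computed; the example scores and the two cut points (one for "met standard", a higher one for "highly competent") are invented placeholders, since the report does not give the cut scores the VCCS used.

    from statistics import mean, median

    def summarize(scores, met_cut, high_cut):
        """Return the descriptive statistics used in the results table.

        met_cut and high_cut are hypothetical cut scores: a score at or above
        met_cut meets the standard; at or above high_cut it is highly competent."""
        n = len(scores)
        met_or_exceeded = sum(s >= met_cut for s in scores)
        highly_competent = sum(s >= high_cut for s in scores)
        return {
            "Number Tested": n,
            "Average Score": round(mean(scores), 1),
            "Median Score": median(scores),
            "% Met or Exceeded Standard": round(100 * met_or_exceeded / n, 1),
            "% Did Not Meet Standard": round(100 * (n - met_or_exceeded) / n, 1),
            "% Highly Competent": round(100 * highly_competent / n, 1),
            "% Met Standard": round(100 * (met_or_exceeded - highly_competent) / n, 1),
        }

    # Invented scores for illustration only.
    example_scores = [45, 42, 38, 36, 41, 33, 47, 39]
    print(summarize(example_scores, met_cut=37, high_cut=43))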

The test questions cover six competency areas. NOVA scores compared with the system-wide scores for each area are listed below:

Choosing the right source / general library and bibliographic skills (19 items): NOVA 14.3 (75.0%), VCCS 65.7%
Database searching (21 items): NOVA 15.3 (72.8%), VCCS 61.4%
Using Internet sources (11 items): NOVA 9.4 (85.0%), VCCS 76.4%
Ethics (3 items): NOVA 12.2 (76.3%), VCCS 67.0%
Knowledge items: NOVA 29.5 (77.5%), VCCS 67.7%

Information Literacy Test Area Results for NOVA, 2003-2004

Number of Respondents: 9
Overall Mean Score: 63.3
Overall Median Score: 65.0
Proficient: 55.6%
Advanced: 0.0%

Test Areas

1. Determines the nature and extent of the information needed (12 items): Average 8.7, Median 9.0, VCCS Average 8.7
2. Accesses needed information effectively and efficiently (19 items): Average 8.7, Median 9.0, VCCS Average 8.5
3. Evaluates information and its sources critically and incorporates selected information into his or her knowledge base and value system (19 items): Average 13.7, Median 15.0, VCCS Average 13.2
4. Uses information effectively to accomplish a specific purpose (0 items): Average -, Median -, VCCS Average -
5. Understands many of the economic, legal and social issues surrounding the use of information and accesses and uses information ethically and legally (10 items): Average 7.3, Median 7.0, VCCS Average 6.6

VCCS Total: Proficient = 45.1%, Advanced = 1.0%

3-4. The Quantitative Reasoning and Scientific Reasoning Assessment

A pilot test to measure scientific and quantitative reasoning was administered to students who were ready for graduation. It was administered by JMU on their server and took 30-45 minutes. Ideally, all graduating students would have been tested; however, NOVA randomly selected 5% of students, approximately 140 students. The test was conducted during the second and third weeks of April 2004. Some students could not even log on to the server, and in some cases the test locked up and had to be restarted. Sixteen students in total participated institution-wide. The results are as follows:

                     Quantitative Reasoning (36 items)   Scientific Reasoning (35 items)
Number of Students   13                                  13
Average              14.4 (40%)                          21.0 (60%)
Median               13.0                                21.0
VCCS Average         13.0 (36%)                          19.97 (57%)

The Core Competency Working Group at NOVA is currently reviewing our past experience and better ways to administer the Quantitative Reasoning and Scientific Reasoning tests.
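
The percentages in parentheses in the table above are each mean score expressed as a share of the items on that subtest. A minimal check, using only the figures reported above:

    subtests = {
        "Quantitative Reasoning": {"items": 36, "nova_mean": 14.4, "vccs_mean": 13.0},
        "Scientific Reasoning":   {"items": 35, "nova_mean": 21.0, "vccs_mean": 19.97},
    }

    for name, d in subtests.items():
        nova_pct = d["nova_mean"] / d["items"]
        vccs_pct = d["vccs_mean"] / d["items"]
        print(f"{name}: NOVA {nova_pct:.0%}, VCCS {vccs_pct:.0%}")
        # Quantitative Reasoning: NOVA 40%, VCCS 36%
        # Scientific Reasoning:   NOVA 60%, VCCS 57%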

5. Critical Thinking

In the spring of 2006, the VCCS licensed the use of the California Critical Thinking Skills Test (CCTST) from Insight Assessment and conducted a system-wide pilot to assess students' critical thinking skills. A total of 2,045 students from the system's 23 colleges participated. In the spring of 2007, between March 15 and May 15, all 23 VCCS colleges administered the online assessment to students who had applied to graduate. In most cases, students were not given a chance to retake the test unless there were technical problems when the student attempted the test. A total of 2,432 graduating students successfully completed the assessment. Each student who completed the CCTST had an opportunity to review an individualized critical thinking profile outlining his or her performance on each of the measured subscales.

Results

Critical Thinking Competence   N                      %
Met or exceeded standard       2,157 (out of 2,432)   88.7

Summary

In the spring of 2007, a total of 2,432 graduating students from the Virginia Community College System's 23 colleges took the California Critical Thinking Skills Test. Nearly 90% of the students met or exceeded the standard established by the VCCS. Because students generally did not get to take the test more than once, we regard this as a more than acceptable result. This year's results provide an excellent baseline against which to compare future results, as well as data for the VCCS and its colleges on students' performance in the areas of inductive reasoning, deductive reasoning, analysis, inference, and evaluation. Results from the spring 2006 pilot assessment and the spring 2007 assessment administration will also enable the VCCS to conduct an independent standard setting for critical thinking competency.

1. The complete American Philosophical Association Delphi Research Report is available as ERIC Document Number ED 315 423, (c) 1990, The California Academic Press, 217 La Cruz Ave., Millbrae, CA 94030.

6. Oral Communication

In the spring of 2006, the VCCS licensed the use of the Test of Oral Communication Skills (TOCS) from James Madison University and conducted a system-wide pilot to assess students' oral communication skills. A total of 2,075 graduating students from the system's 23 colleges participated. In the spring of 2007, between March 15 and May 15, all 23 VCCS colleges administered the online assessment to students who had applied to graduate. In most cases, students were not given a chance to retake the test unless there were technical problems when the student attempted the test. A total of 2,447 graduating students successfully completed the assessment.

Results

Oral Communication Competence   N                      %
Met or exceeded standard        2,071 (out of 2,447)   84.6

Summary

In the spring of 2007, a total of 2,447 graduating students from the Virginia Community College System's 23 colleges took the Test of Oral Communication Skills. Nearly 85% of the students met or exceeded the standard established by the VCCS. Because students generally did not get to take the test more than once, we regard this as a more than acceptable result. This year's results provide an excellent baseline against which to compare future results. Results from the spring 2006 pilot assessment and the spring 2007 assessment administration will also enable the VCCS to conduct an independent standard setting for oral communication competency.