How To Know If A Student Learns




Direct and Indirect Assessments
- Data-collection methods for assessment purposes typically fall into two categories: direct and indirect.
- Direct evidence of student learning comes in the form of a student product or performance that can be evaluated.
- Indirect evidence is the perception, opinion, or attitude of students (or others).
- Both are important, but indirect evidence by itself is insufficient; direct evidence is required. Ideally, a program collects both types.
- Direct evidence, by itself, can reveal what students have learned and to what degree, but it does not provide information as to why students learned or did not learn.
- The "why" is valuable because it can guide faculty members in how to interpret results and make improvements. Indirect evidence can be used to answer "why" questions.

Types of Direct Data-collection Methods

- Licensure or certification. Example: the AIC Level 1 Certification Exam.
- Embedded testing or quizzes. Example: students' pass rates on the estimating comprehensive final exam.
- Embedded assignments: "signature assignments" that can provide information on a student learning outcome. Students complete these assignments as a regular part of the course, and instructors grade them for the course grade. In addition, the assignments are scored against criteria or a scoring rubric, and those scores are used for program-level assessment. Example: Assignment 7 of the Capstone Project course, which deals with estimating the cost of the project.
- Pre-/post-tests. When used for program assessment, students take the pre-test as part of a required introductory course and the post-test during their senior year, often in a required course or capstone course.
- Employer's or internship supervisor's direct evaluation of student performance: evaluation or rating of student performance in a work, internship, or service-learning experience by a qualified professional.
- Observation of a student performing a task. Example: a professor or an external observer rates each student's classroom presentation using an observation checklist.
- Capstone projects: students produce a final project that demonstrates their cumulative experiences in the program. The project is evaluated by a team of faculty and construction industry representatives.
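The embedded-assignment entry above notes that signature assignments are graded for the course and then re-scored with a rubric for program-level assessment. As a minimal sketch of that aggregation step (assuming a hypothetical 1-4 rubric, an illustrative "meets expectations" target of 3, and invented criterion names, student IDs, and scores), a program might summarize the rubric data per criterion like this:

    from statistics import mean

    # Hypothetical rubric scores (1-4) for one signature assignment, keyed by
    # student ID; criterion names, IDs, scores, and the target are illustrative.
    scores = {
        "s01": {"quantity takeoff": 4, "unit pricing": 3, "bid summary": 2},
        "s02": {"quantity takeoff": 3, "unit pricing": 3, "bid summary": 3},
        "s03": {"quantity takeoff": 2, "unit pricing": 4, "bid summary": 3},
    }
    TARGET = 3  # program-defined benchmark: "meets expectations"

    # Report, for each rubric criterion, the mean score and the share of
    # students at or above the target.
    for criterion in sorted({c for ratings in scores.values() for c in ratings}):
        values = [ratings[criterion] for ratings in scores.values()]
        pct = 100 * sum(v >= TARGET for v in values) / len(values)
        print(f"{criterion}: mean {mean(values):.2f}, {pct:.0f}% at or above target")

The per-criterion summary, rather than the course grade, is what feeds program-level assessment, since the rubric criteria are what map onto the stated learning outcome.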

Types of Direct Data-collection Methods: Advantages and Disadvantages

Licensure or certification
- Advantages: National comparisons can be made. Reliability and validity are monitored by the test developers. An external organization handles test administration and evaluation.
- Disadvantages: Faculty may be unwilling to make changes to their curriculum if students score low (reluctant to "teach to the test"). Students may not be motivated to prepare for and perform well on the test. The test fee. Information from test results is too broad to be used for decision making.

Embedded testing or quizzes
- Advantages: Students are motivated to do well because the test or quiz is part of their course grade. Evidence of learning is generated as part of the normal workload.
- Disadvantages: Faculty members may feel that they are being overseen by others, even if they are not.

Embedded assignments
- Advantages: Students are motivated to do well because the assignment is part of their course grade. Faculty members are more likely to use the results because they are active participants in the assessment process. Online submission and review of materials is possible. Data collection is unobtrusive to students.
- Disadvantages: Faculty time is required to develop and coordinate the assignment, to create a rubric to evaluate it, and to actually score it.

Pre-/post-tests
- Advantages: Provide "value-added" or growth information.
- Disadvantages: Increased workload to evaluate students more than once. Designing pre- and post-tests that are truly comparable at different times is difficult.

Employer's or internship supervisor's direct evaluation of student performance
- Advantages: Evaluation by a career professional is often highly valued by students. Faculty members learn what is expected by industry members.
- Disadvantages: A statistician may be needed to properly analyze the results. Lack of standardization across evaluations may make summarizing the results difficult.

Observation of a student performing a task
- Advantages: Captures data that is difficult to obtain through written tests or other methods.
- Disadvantages: Some may believe observation is subjective and that the conclusions are therefore only suggestive.

Capstone projects
- Advantages: Provide a sophisticated, multilevel view of student achievement. Students have the opportunity to integrate their learning.
- Disadvantages: Creating an effective, comprehensive culminating experience can be challenging. Faculty time is required to develop evaluation methods (multiple rubrics may be needed).
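The pre-/post-test rows above describe the method's main payoff as "value-added" or growth information. A minimal sketch of that calculation, assuming hypothetical matched pre- and post-test scores for the same students (all IDs and numbers are invented for illustration):

    # Hypothetical percent-correct scores on a comparable pre-test and post-test.
    pre = {"s01": 52, "s02": 61, "s03": 47}
    post = {"s01": 78, "s02": 74, "s03": 70}

    # Pair each student's scores and compute the gain; only students with both
    # a pre- and a post-score are included, echoing the comparability caveat.
    gains = {sid: post[sid] - pre[sid] for sid in pre if sid in post}
    avg_gain = sum(gains.values()) / len(gains)
    print(f"average gain: {avg_gain:.1f} points across {len(gains)} matched students")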

Types of Indirect Data-collection Methods

- Student surveys: students self-report via a questionnaire (online, telephone, or paper) about their abilities, attitudes, and/or satisfaction.
- End-of-course evaluations: students report their perceptions of the quality of a course, its instructor, and the classroom environment.
- Alumni surveys: alumni report their perceptions via a questionnaire (online, telephone, or paper).
- Employer surveys: employers complete a survey in which they evaluate the skills, knowledge, and values of new employees who graduated from the program.
- Interviews: face-to-face, one-to-one discussions or question-and-answer sessions. For example, the program administrator or a trained peer interviews seniors to find out which courses and assignments they valued most (and why).
- Focus group interviews: face-to-face, one-to-many discussions or question-and-answer sessions. For example, the program administrator or a graduate student leads a focus group of students enrolled in a scheduling course; the students are asked to discuss their experiences in the course, including difficulties and successes.
- Job placement data: the percentage of students who found employment in a field related to construction within one year.
- Enrollment in higher degree programs: the number or percentage of students who pursued a higher degree in the field.
- Institutional/Program Research data: class size data, graduation rates, retention rates, grade point averages.
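The job placement entry above defines the metric as the share of graduates who found related employment within one year. As a small sketch of how such a rate might be computed from alumni records (the records, field names, and the 365-day window are assumptions for illustration):

    from datetime import date

    # Hypothetical alumni records: graduation date and date of first
    # construction-related job (None if not yet employed in the field).
    alumni = [
        {"graduated": date(2022, 5, 14), "hired": date(2022, 8, 1)},
        {"graduated": date(2022, 5, 14), "hired": date(2023, 9, 10)},
        {"graduated": date(2022, 5, 14), "hired": None},
    ]

    # Count graduates hired into a related field within one year of graduation.
    placed = sum(
        1 for a in alumni
        if a["hired"] is not None and (a["hired"] - a["graduated"]).days <= 365
    )
    print(f"placed in a related field within one year: {100 * placed / len(alumni):.0f}%")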

Types of Indirect Data-collection Methods: Advantages and Disadvantages

Student surveys
- Advantages: Can be administered to large groups at a relatively low cost. Analysis of responses is typically quick and straightforward.
- Disadvantages: Low response rates are typical. With self-efficacy reports, students' perceptions may differ from their actual abilities. Designing reliable, valid questions can be difficult. Caution is needed when trying to link survey results to achievement of learning outcomes.

End-of-course evaluations
- Advantages: Reliable commercial surveys are available. Analysis of responses is typically quick and straightforward.
- Disadvantages: Difficult to summarize results across courses. Results are the property of individual faculty members.

Alumni surveys
- Advantages: Can be administered to large groups at a relatively low cost. Analysis of responses is typically quick and straightforward.
- Disadvantages: Low response rates are typical. Without an up-to-date mailing list, alumni can be difficult to locate. Designing reliable, valid questions can be difficult.

Employer surveys
- Advantages: Can be administered to large groups at a relatively low cost. Analysis of responses is typically quick and straightforward. Provides a real-world perspective.
- Disadvantages: Low response rates are typical.

Interviews
- Advantages: Provide rich, in-depth information and allow for tailored follow-up questions. "Stories" and voices can be powerful evidence for some groups of intended users.
- Disadvantages: Trained interviewers are needed. Transcribing, analyzing, and reporting are time consuming.

Focus group interviews
- Advantages: Provide rich, in-depth information and allow for tailored follow-up questions. The group dynamic may spark more information; groups can become more than the sum of their parts. "Stories" and voices can be powerful evidence for some groups of intended users.
- Disadvantages: Trained facilitators are needed. Transcribing, analyzing, and reporting are time consuming.

Job placement data
- Advantages: Satisfies some accreditation agencies' reporting requirements.
- Disadvantages: Tracking alumni may be difficult.

Enrollment in higher degree programs
- Advantages: Satisfies some accreditation agencies' reporting requirements.
- Disadvantages: Tracking alumni may be difficult.

Institutional/Program Research data
- Advantages: Can be effective when linked to other performance measures and to the results of the assessment of student learning (using a direct method).

Evaluate Your Choice of Data-collection Method

A well-chosen method:
1. provides specific answers to the assessment question being investigated.
2. is feasible to carry out given program resources and the amount of time faculty members are willing to invest in assessment activities.
3. has a maximum of positive effects and a minimum of negative ones; the method should give faculty members, intended users, and students the right messages about what is important to learn and teach.
4. provides useful, meaningful information that can be used as a basis for decision making.
5. provides results that faculty members and intended users will believe are credible.
6. provides results that are actionable: faculty members will be willing to discuss and make changes to the program (as needed) based on the results.
7. takes advantage of existing products (e.g., exams or surveys the faculty/program already use) whenever possible.