Sampling, Recruitment, & Motivation




SAMPLING

Purposeful sampling of students is an important part of obtaining valid and reliable results. While each institution is responsible for student sampling and recruitment, CLA+ staff are available to assist and to discuss a sampling and recruitment plan that best suits a particular campus. CAE has two primary report types, each with distinct sampling requirements: cross-sectional results and mastery results.

GENERAL SAMPLING RULES

To attain statistically reliable and representative results, CAE recommends sampling at least 100 students, or 25-50% of the population size for each class level tested. Institutional results will only include students who have completed the entire assessment. This includes providing a scorable CLA+ Performance Task response and completing the demographic and motivation surveys provided after the exam.

Cross-sectional results include growth estimates (in the form of effect sizes and value-added scores) and normed data, such as percentile rankings. Cross-sectional reports also include summary scores, subscores, mastery levels, and more. Institutions that wish to receive cross-sectional results will need to follow the sampling requirements outlined in the table below.

Mastery results include statistics only for the students tested within a specific administration; as such, they do not include growth estimates or normed data and have less stringent sample requirements. These results include summary scores, subscores, mastery levels, and more. Students do not need to test within a specific administration or have SAT/ACT/SLE scores in order to be included in the institutional sample for this type of reporting.

REPORT RESULTS | FALL TESTING REQUIREMENTS | SPRING TESTING REQUIREMENTS
Value-Added Scores* (must test in fall and spring) | Entering students (freshmen) | Exiting students (seniors; ideally non-transfers)
Effect Sizes (must test in fall and spring) | Entering students (freshmen) | Any non-freshmen (sophomores, juniors, seniors; ideally non-transfers)
Percentile Rankings (can choose to test in fall or spring) | Entering students (freshmen) | Any non-freshmen (sophomores, juniors, seniors; ideally non-transfers)

*Value-added scores estimate growth across the full college experience. Due to the limited number of community colleges participating in CLA+, this statistic is only available for 4-year colleges and universities.

Institutions planning to administer CLA+ to the same students longitudinally are encouraged to sample at least 300 students as freshmen during the first fall administration. Please contact your CLA+ representative if your institution would like further instructions on how to construct a self-guided longitudinal study.

SAMPLING STRATEGIES

Institutions should try to approximate a true random sample. Below are two sampling strategies employed by two different institutions; an illustrative code sketch covering both appears later in this section. Sampling and recruitment strategies, as well as testing conditions, should be standardized across fall and spring administrations.

SIMPLE RANDOM SAMPLE:
1. Obtain a list of 1,000 registered students (name, student ID, etc.) from the Registrar's Office.
2. Generate a list of random numbers (e.g., in Excel).
3. Attach the list of random numbers to the list of students.
4. Sort the students by these random numbers.
5. Select every fifth student, yielding 200 invitees, on the assumption that roughly half of the students selected will participate in CLA+.

NON-RANDOM, BUT REPRESENTATIVE, SAMPLE:
1. Identify six courses (e.g., Psychology, English, Art History, Economics, Biology, Statistics) covering a range of disciplines that reach a representative cross section of the 700 students at your school. Make sure not to exclude any major demographic group (gender/race) or field of study (social sciences, natural sciences, liberal arts), and confirm that all groups are proportionally included.
2. Sample within these courses, aiming for student counts in proportion to course enrollment sizes. For example, if Biology has 200 students, 200/700 is approximately 29%, so test 29 students from Biology. If the Sociology Department has only 100 students, 100/700 is approximately 14%, so test 14 of those students. Continue sampling in this way until your institution has met its target sample size.

FACULTY SUPPORT & INVOLVEMENT

Embedding CLA+ into courses or campus activities is an effective way to ensure student participation. Examples include administering CLA+ during orientation seminars or capstone courses. Faculty involvement is crucial.

OBTAINING FACULTY SUPPORT

Participating institutions have emphasized the importance of obtaining faculty support for CLA+. For example, institutions' methods for garnering support include obtaining permission from faculty members to post bulletins in classrooms, asking faculty to make announcements that encourage student participation, or requesting use of class time to administer CLA+. Sharing information about CLA+ in advance of student recruitment helps to foster a collaborative relationship with faculty. Some schools have also organized an Assessment Day. This day could include CLA+ as well as other student activities, such as course evaluations or program assessments.

PROMOTING FACULTY ENGAGEMENT

Hold an information session so faculty will be more invested in CLA+ and truly understand why the institution is participating. If your institution has participated in the past, share the results. Faculty are stakeholders in CLA+ results.

Encourage faculty and staff to view an example of a CLA+ task online by visiting http://cae.org/cla-sampletasks.

Invite faculty to participate in a Performance Task Academy. For more information, visit http://cae.org/pta.
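The following is a minimal, illustrative Python sketch of the two sampling strategies described above; it is not part of the official CLA+ materials. The roster, course names, enrollment counts, and target sample size are hypothetical placeholders that an institution would replace with its own Registrar data.

"""Illustrative sketch only: two ways to draw a CLA+ sample (hypothetical data)."""
import random

# --- Simple random sample ----------------------------------------------------
# Placeholder roster of 1,000 registered students obtained from the Registrar's Office.
roster = [f"S{i:04d}" for i in range(1, 1001)]

random.seed(42)                                 # fixed seed so the draw is reproducible
shuffled = random.sample(roster, len(roster))   # equivalent to sorting by random numbers
invitees = shuffled[::5]                        # every fifth student -> 200 invitees
# Assuming roughly half of the invitees participate, this yields about 100 completed tests.

# --- Non-random, but representative, sample -----------------------------------
# Allocate a target sample across courses in proportion to their enrollment.
enrollments = {"Psychology": 150, "English": 120, "Art History": 80,
               "Economics": 100, "Biology": 200, "Statistics": 50}   # hypothetical counts
total = sum(enrollments.values())               # 700 students in this example
target_sample = 100                             # CAE's recommended minimum per class level

allocation = {course: round(target_sample * n / total) for course, n in enrollments.items()}
print(allocation)  # e.g., Biology: 200/700 of 100 students is approximately 29

Any class roster and target size can be substituted; the proportional allocation reproduces the 200/700 (approximately 29%) Biology example from the text.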

STUDENT RECRUITMENT TECHNIQUES

INTERNAL MANAGEMENT & PLANNING

Brainstorm with faculty to discuss recruitment methods by asking: What methods have worked in the past? What methods work best for the culture of your institution?

Talk to staff in Admissions and at other campus offices, such as Institutional Research or Career Services, and ask for suggestions on how to engage students on campus.

Use online event management tools, such as Eventbrite (http://www.eventbrite.com/), to promote CLA+ testing, send reminders, and track student attendance.

Consider remote proctoring services for students who are not available to test on campus.

RECRUITMENT METHODS & TECHNIQUES

Plan an awareness/advertising campaign. Send campus e-blast emails. These can include links to the CLA+ guides available at http://cae.org/claguides, which help students better understand their role in testing and how CLA+ results can help the institution.

Post signs in high-traffic areas around campus, or use social media to promote CLA+.

Share ways in which students can use their student score reports: students can supplement their graduate school applications with CLA+ scores, share their results with prospective employers as evidence of their critical-thinking skills, and use their results to identify areas where they are excelling or may want to focus their efforts.

Host a campus-wide Assessment Day at your institution. Incorporate other activities on this day that would benefit your institution, such as surveys or qualitative feedback for the institution or its departments. Share how your school uses the assessments to make improvements to the curriculum.

Create a webpage explaining CLA+ and how involvement will affect your school. You could also include past institutional results.

For freshmen, have a sign-up table during orientation. We suggest that you: reserve all computer labs in advance for specific testing dates; have students sign up for a date and time on the spot; and take their email addresses and/or phone numbers and inform them that they will be contacted twice, first to confirm that they signed up and again as a reminder closer to the actual test date.

Have a successful recruiting strategy? Share it with us.

INCENTIVES FOR STUDENTS

Embedding CLA+ in campus events may not work for all institutions. Depending on your campus culture, you may want to offer incentives to your students to encourage participation. Experience from most campuses suggests that simply asking students to participate voluntarily in CLA+ is not as successful as other recruitment strategies. The following is a list of strategies used to recruit students for CLA+, ordered by the preferences of freshmen who were surveyed. Please note that successful recruitment practices vary across institutions.

MOST PREFERABLE
Course registration preference for the next semester
In-kind payment (discount of $25 at the bookstore, gift certificate, etc.)
Prizes or raffles (1 in 10 chance of winning an iPad)
Cash incentives ($25)
Inclusion of CLA+ score on résumé

MODERATELY PREFERABLE
Obtaining feedback on personal strengths and weaknesses from faculty (professor, advisor, etc.) or a school administrator (president, dean, provost, etc.)
Helping the school assess student learning
Reviewing student performance as it compares to peers
Acknowledgment of participation in printed materials (newsletter, annual report, etc.)

OTHER IDEAS FROM CLA+ CAMPUSES
Priority seating and/or extra tickets for graduation
Providing snacks on the day of testing
Recognition for participating during graduation
Reserved parking permit
University paraphernalia (T-shirts, pens, etc.)
USB flash drives for all participants
Participation and results noted on student transcripts
Priority placement in the housing lottery

CAE strongly discourages the use of punitive recruitment and motivational techniques.

RECRUITMENT CORRESPONDENCE

It is important to designate one coordinator to manage all aspects of CLA+ participation, from initial communications to recruiting students, determining the testing schedule, securing computer facilities, and coordinating student data collection.

WHO SHOULD SEND CLA+ CORRESPONDENCE?

Students are more likely to read messages from people they know and respect. To avoid the perception that CLA+ recruitment is a bureaucratic request, you might select the president's or provost's office, student advisors, instructors, department heads, student leaders, and/or thesis advisors to send the recruitment letters.

RECRUITMENT CORRESPONDENCE RESOURCES

See below for a sample email to students. Email templates and a correspondence timeline are available upon request. CAE can also provide flier templates that can be adapted and displayed around campus.

TIPS
Personalize correspondence (avoid "dear student" salutations and multiple recipient addresses or a listserv origin for emails) so as not to suggest a mass mailing; an illustrative script follows the sample email below.
Keep messages brief.
Inform students of alternative ways to respond (e.g., email, phone call).
Emphasize the benefits of participation.
Reference incentives (if your institution has chosen to use them).

THE COLLEGIATE LEARNING ASSESSMENT (CLA+) (MM/DD/YY): PRE-NOTIFICATION

From: Sender Name [mailto:sender@universitycollege.edu]
To: Jane Doe (not a listserv, not multiple recipients)

Dear Jane,

You have been selected to represent University College in the Collegiate Learning Assessment on MM/DD/YY. The Collegiate Learning Assessment (CLA+) is an open-ended assessment that provides a measure of your own development of critical thinking and written communication skills. For more information about CLA+, visit http://cae.org/cla-student-guide/. Please RSVP to participate using the information found at the bottom of this email.

The principal goal of CLA+ is to assist faculty and administrators in improving teaching and learning. CLA+ measures your analytic reasoning, problem solving, and written communication skills, among others. You will be provided a score report comparing your performance to that of other students at your institution and nationwide. Your participation will also help University College gauge its performance in helping you develop these skills and, in the future, focus its resources on what works.

Sincerely,
Sender
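As a complement to the tips above, here is a minimal, illustrative Python sketch, not part of the official CLA+ materials, of how a coordinator might send individually addressed invitations instead of a single listserv blast. The file name, column names, SMTP host, and addresses are hypothetical placeholders for your institution's own systems.

"""Illustrative sketch only: individually addressed CLA+ invitation emails (hypothetical values)."""
import csv
import smtplib
from email.message import EmailMessage

TEMPLATE = """Dear {first_name},

You have been selected to represent {institution} in the Collegiate Learning Assessment (CLA+) on {test_date}.
For more information, visit http://cae.org/cla-student-guide/. Please RSVP using the contact details below.

Sincerely,
{sender_name}
"""

def send_invitations(csv_path, smtp_host, sender_address, sender_name):
    """Send one personalized message per row of the invitee roster
    (expected columns: first_name, email, institution, test_date)."""
    with open(csv_path, newline="") as f, smtplib.SMTP(smtp_host) as smtp:
        for row in csv.DictReader(f):
            msg = EmailMessage()
            msg["Subject"] = "Invitation: Collegiate Learning Assessment (CLA+)"
            msg["From"] = sender_address
            msg["To"] = row["email"]          # one recipient per message, never a listserv
            msg.set_content(TEMPLATE.format(first_name=row["first_name"],
                                            institution=row["institution"],
                                            test_date=row["test_date"],
                                            sender_name=sender_name))
            smtp.send_message(msg)

# Example call with hypothetical values:
# send_invitations("cla_invitees.csv", "smtp.universitycollege.edu",
#                  "provost@universitycollege.edu", "Jane Provost")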

MOTIVATION

Campuses are often concerned about how to motivate their students, not only to take CLA+ but also to perform well. New survey questions provide schools with data about their students' effort on CLA+, and the results are delivered to institutions in their reports. Institutions play a key role in fostering a culture of engagement and motivation among students.

CAE examined student motivation in its field trials and feasibility study by holding focus groups with all students who took the CLA assessments. Through the focus groups, CAE found that the Performance Tasks themselves engage student interest. Students find the problems to be realistic, fun, and a fair test of skills they think are important (Klein, Kuh, Chun, Hamilton, & Shavelson, 2005). Indeed, students reported (via post-assessment surveys) that they wished they had encountered more assessments like the CLA in college and eagerly wanted to know how well they did compared to students at their own institutions and at others, data that CAE supplies at the conclusion of each administration (Klein, Benjamin, Shavelson, & Bolus, 2007).

Not surprisingly, students' self-reported effort, time spent on the test, and CLA scores are positively correlated. CAE's research has shown that, for individual students, motivation is a significant predictor of CLA scores (Steedle, 2010). Yet the underlying nature of these relationships, including the direction of causality, is unknown. CAE supplies institutions with data on the amount of time each of their students spent on the CLA, and with CLA+, CAE will be introducing student responses to survey questions about their effort on and engagement with the assessment. Individual schools can also collect additional data on student self-reported motivation through a locally customizable online survey that students complete alongside the standard CLA+ survey questions (see the Administrator Manual for directions). These data empower institutions to evaluate the relationships between time and effort spent on the test, engagement with the assessment, and student scores on their campuses, and to draw appropriate conclusions from their CLA+ results.

Traditionally, the CLA has been a low-stakes assessment; however, with the introduction of new selected-response questions, CLA+ allows for more reliable student-level results than the traditional CLA. This increased reliability in turn provides new opportunities for using CLA+ scores: colleges and graduate schools can use CLA+ scores as additional admissions information to evaluate the strengths and weaknesses of entering students, and graduating seniors can use their verified scores to provide potential employers with evidence of their work-readiness skills. These new uses for CLA+ scores can be promoted to encourage students to participate in CLA+, put forth their best effort on the assessment, and take pride in their performance.

Nonetheless, developing students' typical performance level so that it approaches their maximal performance is a laudable goal. Higher-education institutions should instill in their students the habit of devoting due effort to any task, whether it is taking a course, engaging in volunteer work, or participating in an assessment like CLA+. In other words, producing students who strive to perform well in all endeavors is itself a goal every higher-education institution has, or should have. The degree of motivation students bring to CLA+ is inextricably linked to the norms of performance expected by their educational institutions. In that sense, student willingness to perform well on CLA+ is one indicator of institutional impact.

Additionally, CAE urges campuses to think about how they integrate assessment into their institutional fabric so that participation in assessments like CLA+ becomes part of the educational plan for each student. CAE believes that campuses should have mechanisms to provide feedback to students, faculty, programs, and the institutions themselves. Effective and widespread communication about CLA+ will increase support for the important endeavor of institutional assessment while increasing the effort students apply to such assessments. For more information, please read http://cae.org/images/uploads/pdf/test_administration_procedures.pdf.

Sources:

Klein, S., Benjamin, R., Shavelson, R., & Bolus, R. (2007). The Collegiate Learning Assessment: Facts and fantasies. Evaluation Review, 31(5), 415-439.

Klein, S., Kuh, G., Chun, M., Hamilton, L., & Shavelson, R. (2005). An approach to measuring cognitive outcomes across higher-education institutions. Research in Higher Education, 46(3), 251-276.

Steedle, J. T. (2010). Incentives, motivation, and performance on a low-stakes test of college learning. Paper presented at the Annual Meeting of the American Educational Research Association, Denver, CO.