
PEARL: Program Excellence through Assessment, Research and Learning

A collaborative project

Colorado State University
The College of Agricultural Sciences and Natural Resources
The College of Education and Human Sciences
The College of Journalism and Mass Communications
The Division of Student Affairs
The Hixson-Lied College of Fine and Performing Arts
The Office of Undergraduate Studies

Developed for PEARL by Jeremy Penn and Jessica Jonson
Office of Undergraduate Studies
University of Nebraska-Lincoln

Updated: Spring 2008

The University of Nebraska-Lincoln does not discriminate based on gender, age, disability, race, color, religion, marital status, veteran's status, national or ethnic origin, or sexual orientation.

Introduction

Program Excellence through Assessment, Research, and Learning (PEARL) is an online environment that supports and tracks the planning and implementation of the assessment processes programs use to gather evidence about student learning. The goal of this process is to use that evidence in decisions about how to improve our educational programs.

The current participating groups are:
Colorado State University
The College of Agricultural Sciences and Natural Resources
The College of Education and Human Sciences
The College of Journalism and Mass Communications
The Division of Student Affairs
The Hixson-Lied College of Fine and Performing Arts
The Office of Undergraduate Studies

PEARL History

The software used by PEARL began development at Florida Atlantic University. Development of the software, called PRISM, continued under the direction of Dr. Kim Bender at Colorado State University. In the summer of 2004, a visitation team made up of members from the College of Education and Human Sciences, the College of Agricultural Sciences and Natural Resources, and the Office of the Dean of Undergraduate Studies visited Colorado State University and received a demonstration of PRISM. In 2005, Colorado State University and the University of Nebraska-Lincoln reached a partnership agreement; PRISM was brought to UNL and renamed PEARL. In the fall of 2005, undergraduate academic programs from CEHS and CASNR began using PEARL in the assessment of student learning outcomes. The College of Journalism and Mass Communications and the Hixson-Lied College of Fine and Performing Arts joined PEARL in 2006 and will complete their first full PEARL cycle in the fall of 2008. The Division of Student Affairs joined PEARL in 2007 and will complete its first PEARL cycle in 2009.

PEARL Participants

PEARL participants are organized into five categories, as shown in Figure 1.

Figure 1: PEARL Roles and Relationships

The figure was intentionally drawn horizontally instead of vertically to represent PEARL's grass-roots approach rather than a top-down approach. The arrows at the top of the figure represent the relationships between the different roles. Moving from left to right on the diagram, the relationship requires communicating the vision for PEARL, coaching in the techniques of PEARL, leading the direction for PEARL, organizing meetings and events, and assisting in completing the work of PEARL. Moving from right to left on the diagram, the relationship requires communicating your work in PEARL, advising on ways to improve PEARL, sharing your work in PEARL, and assisting in completing the work of PEARL.

PEARL Steering Committee: Individuals in this group communicate with Program Leaders and Peer Reviewers and provide the overall vision and leadership for PEARL. They primarily use the system to track programs' progress through the assessment process.

Peer Reviewers: The Peer Reviewers work closely with both the PEARL Steering Committee and the PEARL Program Leaders. They provide feedback and coaching to the Program Leaders on their PEARL plans and also offer suggestions to the Steering Committee on ways to improve the PEARL process.

Program Leaders: Program Leaders provide the program-level leadership needed to ensure the forward progress of the PEARL process within their program.

Program faculty or staff: Program faculty and staff have little direct interaction with the PEARL software but remain active by supporting the Program Leaders in identifying student learning outcomes, selecting assessment measures, gathering assessment evidence, and making program improvements.

The persons at UNL who have participated in PEARL are listed below.

Steering Committee Members: Bob Fought (HLCFPA, 2006- ); Susan Fritz (CASNR, 2004-2006); Frauke Hachtmann (CJMC, 2008- ); Laura Hardin (CASNR, 2006-2007); Fayrene Hamouz (CEHS, 2004-2005); Jessica Jonson (OUS, 2004- ); Linda Major (DSA, 2007- ); John Markwell (CASNR, 2007- ); Nancy Mitchell (CJMC, 2006-2008); Jeremy Penn (OUS, 2004- ); Jim Walter (CEHS, 2004- )

Peer Reviewers: Stan Brown (HLCFPA); Brent Cejda (CEHS); David Fowler (CEHS); Shelley Fuller (HLCFPA); Dann Husmann (CASNR); David Jackson (CASNR); Michael James (CEHS); Julie Johnson (CEHS); Steve Jones (CASNR); Phyllis Larsen (CJMC); Luis Peon-Casanova (CJMC); Marilyn Scheffler (CEHS); Madhavan Soundararajan (CASNR); Betty Walter-Shea (CASNR); Dan Walters (CASNR); Curt Weller (CASNR); Bob Woody (HLCFPA); Linda Young (CEHS); Michael Zeece (CASNR)

Program Leaders: John Barbuto (CASNR); Lloyd Bell (CASNR); Charlyne Berens (CJMC); Richard Bischoff (CEHS); Dennis Brink (CASNR); Trudy Burge (CJMC); Tim Carr (CEHS); Susan Churchill (CEHS); Pat Crews (CEHS); Rochelle Dalla (CEHS); Edward Daly (CEHS); Stephen Danielson (CASNR); Jason Ellis (CASNR); Rich Endacott (HLCFPA); Ed Forde (HLCFPA); John Foster (CASNR); Rhonda Fuelberth (HLCFPA); David Gosselin (CASNR); Frauke Hachtmann (CJMC); Fayrene Hamouz (CEHS); Tiffany Heng-Moss (CASNR); David Jackson (CASNR); Julie Johnson (CEHS); John Lammel (CEHS); Bill Latta (CEHS); Margaret Latta (CEHS); John Markwell (CASNR); Dennis McCallister (CASNR); Rodney Moxley (CASNR); James Partridge (CASNR); Helen Raikes (CEHS); Terry Riordan (CASNR); Jeff Rudy (CEHS); Walter Schacht (CASNR); Marilyn Scheffler (CEHS); Rosalee Swartz (CASNR); Curtis Weller (CASNR); Linda Young (CEHS)

Level of Assessment

The relationship between assessment at the course, program, and institutional levels is shown in Figure 2.

Figure 2: Course vs. Program vs. Institutional Assessment

Course assessment
Who is assessing? Course instructor
What is assessed? Individual student product
For what purpose? Student learning and grading

Program / sequence assessment
Who is assessing? Program / sequence faculty team
What is assessed? Collection or sample of student products from across the program / sequence
For what purpose? Program / sequence improvement

College or institution assessment
Who is assessing? Faculty and administration
What is assessed? Summary of program / sequence activity and institutional measures
For what purpose? Institutional improvement and accountability

Each row in Figure 2 represents a different level of assessment: assessment within an individual course, within a program, or within an institution. Each column asks an important assessment question: Who is assessing? What is assessed? For what purpose? In course assessment, a single instructor examines a student's paper or product for the purposes of grading and improving that student's learning. In program or sequence assessment, a group of faculty members looks across a program of study by collecting a sample of student products for the purpose of improving the program or sequence. In institutional assessment, a group of faculty members and administrators examines summaries of program assessment and gathers institutional-level data for the purposes of institutional improvement and accountability.

PEARL Cycle

An overview of the PEARL cycle is shown in Figure 3.

Figure 3: The PEARL cycle

In the first step, Identify student learning outcomes, PEARL Program Leaders work together with the faculty or staff members in their program, and with department chairs and accreditors as needed, to develop and identify the learning outcomes for their program and the assessment measures that will be used to provide evidence of the achievement of those outcomes. Programs are asked to select about three student learning or development outcomes for each PEARL cycle. The Program Leader enters this information, called the PEARL Plan, into the PEARL software.

Next, the PEARL Peer Reviewers examine the PEARL Plan and, using the PEARL Rubric, provide coaching and feedback on ways the plan could be improved. The PEARL Rubric was developed during the PEARL pilot at UNL in 2005 and undergoes periodic revision and updating.

During the fall and spring semesters, the program implements the plan and collects the assessment data. Programs do not need to wait for approval before implementing their plans, because the Peer Reviewers provide coaching and suggestions for improvement, not approval.

At the end of the spring semester, programs finish their data collection and begin analyzing, interpreting, and sharing those data to identify potential program improvements. Programs then develop their PEARL Results, which describe their interpretations of the assessment evidence, how that evidence was shared, and the changes and improvements made to the program. The Program Leaders enter this information into the PEARL software, and the Peer Reviewers review the PEARL Results and provide coaching and feedback.

The PEARL cycle then starts over, with Program Leaders and program faculty or staff members identifying learning outcomes and assessment measures (these can be the same as those used in previous cycles if desired) and entering this information into the PEARL software. The PEARL timeline is flexible and may be organized differently (e.g., a two-year cycle with reporting in May) in participating units.

PEARL Conference Presentations and Papers

Bender, K., & Jonson, J. L. (May, 2007). Assessing student learning outcomes using assessment management software: Benefits, challenges, and lessons learned. Paper presented at the 2007 USDA CSREES North Central Academic Program Section, Lincoln, NE. Hosted by the College of Agricultural Sciences and Natural Resources, UNL.

Jonson, J. L., Bender, K., Siller, T., & Mitchell, N. (April, 2007). Use and impact of a quality enhancement system: A tale of two universities. Paper presented at the 112th Annual Meeting of the North Central Association of Colleges and Schools, Chicago, IL. http://www.unl.edu/ous/faculty_resources/assessment/hlc_nca_2007_(5).pdf

Jonson, J. L., Bender, K., Siller, T., & Walter, J. (June, 2007). Use and impact of a quality enhancement system: A tale of two universities. Paper presented at the 47th Annual Association for Institutional Research Forum, Kansas City, MO.

Penn, J. D., Bender, K., & Mitchell, N. (January, 2008). Using technology to build collective responsibility for improving student learning: Two universities collaborate to develop one organizational learning environment. Paper presented at the 2008 Annual Meeting of the Association of American Colleges and Universities, Washington, D.C. http://www.unl.edu/ous/pearl/aacu2008pennbendermitchell.pdf

Penn, J. D., Jonson, J. L., & Johnson, J. (April, 2008). Developing shared criteria for peer review of program assessment plans. Paper presented at the 113th Annual Meeting of the North Central Association of Colleges and Schools, Chicago, IL.

Penn, J. D., Jonson, J. L., Walter-Shea, E., & Young, L. (November, 2007). Building faculty support for outcomes assessment through faculty development and the implementation of an online assessment management system. Paper presented at the 2007 IUPUI Assessment Institute, Indianapolis, IN. http://www.unl.edu/ous/pearl/pennwalter-sheayoung2007.pdf

Contact Information

PEARL Steering Committee

Office of Undergraduate Studies
Dr. Jessica Jonson, University-wide assessment coordinator: (402) 472-3899, jjonson2@unl.edu
Jeremy Penn, Assessment associate for PEARL: (402) 472-1905, jpenn@unlnotes.unl.edu

College of Agricultural Sciences and Natural Resources
Dr. John Markwell, Associate Dean: (402) 472-2924, jmarkwell2@unl.edu

College of Education and Human Sciences
Dr. Jim Walter, Associate Dean: (402) 472-3392, lwalter1@unl.edu

College of Journalism and Mass Communications
Frauke Hachtmann, Professor: (402) 472-9848, fhachtmann1@unl.edu

Hixson-Lied College of Fine and Performing Arts
Dr. Bob Fought, Associate Dean: (402) 472-9339, rfought1@unl.edu