PRINCIPLES OF HIGH QUALITY ASSESSMENT Allan M. Canonigo http://love4mathed.com

PRINCIPLES OF HIGH QUALITY ASSESSMENT
1. Clarity of learning targets (knowledge, reasoning, skills, products, affects)
2. Appropriateness of Assessment Methods
3. Validity
4. Reliability
5. Fairness
6. Positive Consequences
7. Practicality and Efficiency
8. Ethics

1. CLARITY OF LEARNING TARGETS (knowledge, reasoning, skills, products, affects)
Assessment can be made precise, accurate, and dependable only if what is to be achieved is clearly stated and feasible. The learning targets, involving knowledge, reasoning, skills, products, and affects, need to be stated in behavioral terms, that is, terms which denote something that can be observed in the behavior of the students.

CLARITY OF LEARNING TARGETS (CONT)
Cognitive Targets
Benjamin Bloom (1956) proposed a hierarchy of educational objectives at the cognitive level. These are:
Knowledge - acquisition of facts, concepts, and theories
Comprehension - understanding; involves cognition or awareness of interrelationships
Application - transfer of knowledge from one field of study to another, or from one concept to another concept in the same discipline
Analysis - breaking down a concept or idea into its components and explaining the concept as a composition of these components
Synthesis - the opposite of analysis; entails putting together the components in order to summarize the concept
Evaluation and Reasoning - valuing and judgment, or appraising the worth of a concept or principle

CLARITY OF LEARNING TARGETS (CONT)
Skills, Competencies, and Abilities Targets
Skills - specific activities or tasks that a student can do proficiently
Competencies - clusters of skills
Abilities - made up of related competencies, categorized as: i. cognitive, ii. affective, iii. psychomotor
Products, Outputs, and Project Targets
- tangible and concrete evidence of a student's ability
- the expected level of workmanship of projects needs to be clearly specified: i. expert, ii. skilled, iii. novice

2. APPROPRIATENESS OF ASSESSMENT METHODS
a. Written-Response Instruments
Objective tests - appropriate for assessing the various levels of the hierarchy of educational objectives
Essays - can test the students' grasp of higher-level cognitive skills
Checklists - lists of several characteristics or activities presented to the subjects of a study, who analyze them and place a mark opposite each characteristic that applies

2. APPROPRIATENESS OF ASSESSMENT METHODS
b. Product Rating Scales
- used to rate products such as book reports, maps, charts, diagrams, notebooks, and creative endeavors
- need to be developed to assess the various products students produce over the years
c. Performance Tests - Performance Checklists
- consist of a list of behaviors that make up a certain type of performance
- used to determine whether or not an individual behaves in a certain way when asked to complete a particular task

2. APPROPRIATENESS OF ASSESSMENT METHODS
d. Oral Questioning - an appropriate assessment method when the objectives are to:
- assess the students' stock knowledge, and/or
- determine the students' ability to communicate ideas in coherent verbal sentences
e. Observation and Self-Reports - useful supplementary methods when used in conjunction with oral questioning and performance tests

PROPERTIES OF ASSESSMENT METHODS
Validity
Reliability
Fairness
Positive Consequences
Practicality and Efficiency
Ethics

3. VALIDITY
Something valid is something fair: a valid test is one that measures what it is supposed to measure.
Types of validity
Face: What do students think of the test?
Construct: Am I testing in the way I taught?
Content: Am I testing what I taught?
Criterion-related: How does this test compare with an existing valid test?
Tests can be made more valid by making them more subjective (open items).

MORE ON VALIDITY
Validity - the appropriateness, correctness, meaningfulness, and usefulness of the specific conclusions that a teacher reaches regarding the teaching-learning situation
Content validity - the content and format of the instrument: i. students' adequate experience, ii. coverage of sufficient material, iii. reflection of the degree of emphasis
Face validity - the outward appearance of the test; the lowest form of test validity
Criterion-related validity - the test is judged against a specific criterion
Construct validity - the test loads on a construct or factor

4. RELIABILITY
Something reliable is something that works well and that you can trust: a reliable test is a consistent measure of what it is supposed to measure.
Questions: Can we trust the results of the test? Would we get the same results if the test were taken again and scored by a different person?
Tests can be made more reliable by making them more objective (controlled items).

Reliability is the extent to which an experiment, test, or any measuring procedure yields the same result on repeated trials.

Equivalency reliability is the extent to which two items measure identical concepts at an identical level of difficulty. Equivalency reliability is determined by relating two sets of test scores to one another to highlight the degree of relationship or association.

Stability reliability (sometimes called test-retest reliability) is the agreement of measuring instruments over time. To determine stability, a measure or test is repeated on the same subjects at a future date.
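Both equivalency and stability reliability are typically estimated by correlating the two sets of scores: two parallel forms in the first case, two administrations of the same test in the second. A minimal sketch in Python, using a hand-rolled Pearson correlation and invented scores for ten hypothetical students:

    from math import sqrt

    def pearson_r(x, y):
        """Pearson product-moment correlation between two score lists."""
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
        sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
        return cov / (sd_x * sd_y)

    # Hypothetical data: the same ten students scored on two occasions
    # (test-retest) or on two parallel forms (equivalency).
    first_set = [78, 85, 62, 90, 71, 88, 55, 93, 67, 80]
    second_set = [75, 88, 65, 92, 70, 84, 58, 95, 64, 83]

    print(f"reliability estimate r = {pearson_r(first_set, second_set):.2f}")

The closer r is to 1, the more consistent the two sets of scores are.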

Internal consistency is the extent to which tests or procedures assess the same characteristic, skill or quality. It is a measure of the precision between the observers or of the measuring instruments used in a study.

Interrater reliability is the extent to which two or more individuals (coders or raters) agree. Interrater reliability addresses the consistency of the implementation of a rating system.
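A common statistic for interrater reliability between two raters is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A minimal sketch, with invented pass/fail ratings from two hypothetical raters:

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa: observed agreement corrected for chance agreement."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement: probability that two independent raters pick
        # the same category, given their observed category rates.
        expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
        return (observed - expected) / (1 - expected)

    ratings_a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail"]
    ratings_b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass"]

    print(f"kappa = {cohens_kappa(ratings_a, ratings_b):.2f}")

A kappa of 1 means perfect agreement; 0 means agreement no better than chance.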

RELIABILITY (CONSISTENCY, DEPENDABILITY, STABILITY), WHICH CAN BE ESTIMATED BY:
Split-half method, calculated using
i. the Spearman-Brown prophecy formula
ii. the Kuder-Richardson formulas KR-20 and KR-21
Consistency of test results when the same test is administered at two different time periods
i. test-retest method
ii. correlating the two sets of test results
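A minimal sketch of these single-administration estimates, assuming an invented matrix of dichotomous item scores (1 = correct, 0 = incorrect), one row per student. The split-half estimate correlates odd-item and even-item half scores and steps the result up to full test length with the Spearman-Brown prophecy formula, r_full = 2r / (1 + r); KR-20 is k/(k-1) * (1 - sum(p*q) / variance of total scores), where p is the proportion answering an item correctly and q = 1 - p:

    from math import sqrt

    def pearson_r(x, y):
        """Pearson correlation between two score lists."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        return cov / (sqrt(sum((a - mx) ** 2 for a in x)) *
                      sqrt(sum((b - my) ** 2 for b in y)))

    def split_half(items):
        """Split-half reliability stepped up with Spearman-Brown."""
        odd = [sum(row[0::2]) for row in items]    # score on odd items
        even = [sum(row[1::2]) for row in items]   # score on even items
        r = pearson_r(odd, even)
        return 2 * r / (1 + r)                     # Spearman-Brown prophecy

    def kr20(items):
        """Kuder-Richardson formula 20 for dichotomous items."""
        n, k = len(items), len(items[0])
        totals = [sum(row) for row in items]
        mean = sum(totals) / n
        variance = sum((t - mean) ** 2 for t in totals) / n
        pq = 0.0
        for i in range(k):
            p = sum(row[i] for row in items) / n   # proportion correct on item i
            pq += p * (1 - p)
        return k / (k - 1) * (1 - pq / variance)

    def kr21(items):
        """Kuder-Richardson formula 21 (assumes items of equal difficulty)."""
        n, k = len(items), len(items[0])
        totals = [sum(row) for row in items]
        mean = sum(totals) / n
        variance = sum((t - mean) ** 2 for t in totals) / n
        return k / (k - 1) * (1 - mean * (k - mean) / (k * variance))

    # Hypothetical item-response matrix: six students by six items.
    scores = [
        [1, 1, 1, 1, 0, 1],
        [1, 0, 1, 1, 1, 0],
        [0, 1, 0, 1, 0, 0],
        [1, 1, 1, 1, 1, 1],
        [0, 0, 1, 0, 0, 1],
        [1, 1, 0, 1, 1, 1],
    ]
    print(f"split-half (Spearman-Brown): {split_half(scores):.2f}")
    print(f"KR-20: {kr20(scores):.2f}")
    print(f"KR-21: {kr21(scores):.2f}")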

5. FAIRNESS
The concept that assessment should be 'fair' covers a number of aspects:
- student knowledge of the learning targets and assessments
- opportunity to learn
- prerequisite knowledge and skills
- avoiding teacher stereotypes
- avoiding bias in assessment tasks and procedures

6. POSITIVE CONSEQUENCES
Learning assessments provide students with effective feedback and can improve their motivation and/or self-esteem. Moreover, assessments of learning give students the tools to assess themselves and understand how to improve.
Assessment should have positive consequences for students, teachers, parents, and other stakeholders.

7. PRACTICALITY AND EFFICIENCY
Something practical is something effective in real situations: a practical test is one which can be practically administered.
Questions: Will the test take longer to design than to administer? Will the test be easy to mark?
Tests can be made more practical by making them more objective (more controlled items).

Factors to consider:
- teacher familiarity with the method
- time required
- complexity of administration
- ease of scoring
- ease of interpretation
- cost
Teachers should be familiar with the test; it should not require too much time and should be implementable.

RELIABILITY, VALIDITY & PRACTICALITY
The problem:
- The more reliable a test is, the less valid it tends to be.
- The more valid a test is, the less reliable it tends to be.
- The more practical a test is, (generally) the less valid.
The solution: as in everything, we need a balance (in both exams and exam items).

8. ETHICS
Informed consent
Anonymity and confidentiality in:
1. gathering data
2. recording data
3. reporting data

ETHICS IN ASSESSMENT - RIGHT AND WRONG
Ethics means conforming to the standards of conduct of a given profession or group.
Ethical issues that may be raised:
i. possible harm to the participants
ii. confidentiality
iii. presence of concealment or deception
iv. temptation to assist students