Data Analysis, Presentation, and Reporting
1 Data Analysis, Presentation, and Reporting. 2014 Assessment Leadership Institute. Review of the assessment cycle: Learning Outcomes; Learning Opportunities (curriculum map or an activity matrix); identify learning evidence; Collect & Evaluate Evidence (evaluation tool, e.g., rubric; set standards); Interpret Results; Use Results.
2 New topics in the cycle: collect evidence (sampling), interrater consistency, and analyze results. Data Collection: Sampling. Why sampling: it is manageable to collect and evaluate, and robust sampling supports stronger conclusions. Considerations: What is the tolerable margin of error? How much work can faculty handle?
3 Random Sampling. Scenario: collect 200 student papers from 8 senior writing courses, then randomly select 50 (25%) to evaluate. Cluster Sampling. Scenario: collect all student research papers from several randomly selected research classes (e.g., Research Classes 705, 716, 720, 726, and 738).
4 Cluster + Random Sampling. Scenario: randomly select several student research papers from each of several randomly selected research classes (e.g., Research Classes 705, 716, 720, 726, and 738). Next topic in the cycle: collect evidence (sampling), interrater consistency, analyze results.
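The three sampling designs above can be sketched in Python. Everything here is illustrative: the class numbers, population size, and per-class sample sizes are made-up stand-ins for a program's actual papers.

```python
import random

random.seed(42)  # fixed seed so this illustrative run is reproducible

# Hypothetical population: 200 papers spread evenly across 8 senior writing classes.
papers = [(f"Class {c}", f"paper_{c}_{i}") for c in range(701, 709) for i in range(25)]

# Simple random sampling: draw 50 of the 200 papers (25%) directly.
simple_sample = random.sample(papers, 50)

# Cluster sampling: randomly pick 3 whole classes and keep every paper in them.
classes = sorted({cls for cls, _ in papers})
chosen_classes = random.sample(classes, 3)
cluster_sample = [p for p in papers if p[0] in chosen_classes]

# Cluster + random sampling: within each chosen class, draw 10 papers at random.
two_stage_sample = []
for cls in chosen_classes:
    in_class = [p for p in papers if p[0] == cls]
    two_stage_sample.extend(random.sample(in_class, 10))

print(len(simple_sample), len(cluster_sample), len(two_stage_sample))  # 50 75 30
```

The two-stage design keeps the workload bounded (here, 30 papers) while still spreading the sample across more than one class.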
5 Interrater consistency. Interrater agreement: what percentage of rater pairs gave exactly the same score (e.g., Rater 1 and Rater 2 agree on 70% of the papers)? New topic in the cycle: analyze results.
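Percent agreement for a pair of raters is just the share of identical scores; the rubric scores below are invented for illustration:

```python
# Two raters' rubric scores (1-4 scale) for ten hypothetical papers.
rater1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater2 = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

# Interrater agreement: the fraction of papers where the pair gave the same score.
matches = sum(a == b for a, b in zip(rater1, rater2))
agreement = matches / len(rater1)
print(f"{agreement:.0%}")  # 8 of the 10 scores match, so this prints 80%
```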
6 Learning Outcomes: participants are able to apply a data summarization technique that is appropriate for the reporting purpose and audience. Agenda: planning a summary and an analysis; basic ways to summarize data; achievement questions (from one or more data sources); change over time; difference between groups; summarizing comments/open-ended responses; principles of data presentation; reporting language.
7 Planning a Summary & an Analysis: the questions to be answered, the audience, and the standards or benchmarks. Questions to be answered: achievement; change over time (Year 1, Year 2, Year 3); difference between groups (Group 1 vs. Group 2).
8 Audience and purpose: internal audiences (improvement) and external audiences (accountability). Standard/benchmark examples: 80% of the students will achieve the minimum performance level on all outcomes; the average score on the national licensure exam will be above the national average.
9 Basic Ways to Summarize: 1. tallies/counts; 2. percentages; 3. averages; 4. summary of themes in texts. Tallies example: outcomes (1. appropriate use of sources; 2. well-synthesized literature; 3. sound methodologies; 4. appropriate analysis; 5. correct interpretation) tallied as Below, Approaching, Met, or Exceeded. N = 100 students.
10 Tallies example (survey): Through the workshop, I learned how to 1. use guiding questions in data analysis; 2. use descriptive statistics in data summarization; 3. choose a data presentation table appropriate for the audience; 4. summarize achievement from a single source of data; 5. summarize achievement from multiple sources of data. Responses are tallied as Strongly disagree, Somewhat disagree, Somewhat agree, or Strongly agree. N = 100 students. Percentages example (papers rated Below / Approaching / Met / Exceeded): 1. Appropriate use of sources: 20% / 10% / 50% / 20%. 2. Well-synthesized literature: 10% / 20% / 60% / 10%. 3. Sound methodologies: 15% / 5% / 55% / 25%. 4. Appropriate analysis: 5% / 15% / 65% / 15%. 5. Correct interpretation: 5% / 5% / 50% / 40%. Total number of papers rated =
11 Percentages example (survey responses: Strongly disagree / Somewhat disagree / Somewhat agree / Strongly agree): 1. Use guiding questions in data analysis: 20% / 10% / 50% / 20%. 2. Use descriptive statistics in data summarization: 10% / 20% / 60% / 10%. 3. Choose a data presentation table appropriate for audience: 15% / 5% / 55% / 25%. 4. Summarize achievement from a single source of data: 5% / 15% / 65% / 15%. 5. Summarize achievement from multiple sources of data: 5% / 5% / 50% / 40%. Total number of respondents = 100. Averages for test scores: Scientific Method (average = 85%): Item 1, 80%; Item 4, 90%; Item 7, 80%; Item 10, 90%. Quantitative Reasoning (average = 43%): Item 2, 20%; Item 3, 30%; Item 6, 80%.
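Turning tallies into percentages and averaging item scores is plain arithmetic; this Python sketch reuses figures from the slides above (the Scientific Method item scores) next to a hypothetical tally row:

```python
# Hypothetical tallies for one outcome across 100 papers.
tallies = {"Below": 20, "Approaching": 10, "Met": 50, "Exceeded": 20}
total = sum(tallies.values())

# Percentages: each tally divided by the total number of papers rated.
percentages = {level: count / total for level, count in tallies.items()}
print({level: f"{p:.0%}" for level, p in percentages.items()})

# Average % score: the mean of the item-level averages (Scientific Method items).
items = {"Item 1": 0.80, "Item 4": 0.90, "Item 7": 0.80, "Item 10": 0.90}
average = sum(items.values()) / len(items)
print(f"Scientific Method average: {average:.0%}")  # (80+90+80+90)/4 = 85%
```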
12 Summarizing Achievement with a Single Data Source (Tasks A & B on handout). Summarizing Achievement with Multiple Data Sources (Task C on handout).
13 Change Over Time: a table listing, for each outcome (appropriate analysis, SLO 4; appropriate use of sources, SLO 1; well-synthesized literature, SLO 2; sound methodologies, SLO 3), the % met in Year 1, Year 2, and Year 3, and the 3-year change.
14 Chart: 3-Year Comparison of the Percent of Students Who Met the Benchmark on Each SLO (analysis, use of sources, literature review, methodologies) across Year 1, Year 2, and Year 3.
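The 3-year change column is simply Year 3 minus Year 1 in percentage points; a minimal sketch with invented values:

```python
# Hypothetical percent-met values for one outcome over three years.
pct_met = {"Year 1": 0.62, "Year 2": 0.70, "Year 3": 0.80}

# 3-year change: Year 3 minus Year 1, reported in percentage points.
change = pct_met["Year 3"] - pct_met["Year 1"]
print(f"3-year change: {change:+.0%}")  # 80% - 62% gives +18%
```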
15 Difference Between Groups: a table listing, for each writing outcome (appropriate analysis, SLO 4; appropriate use of sources, SLO 1; well-synthesized literature, SLO 2; sound methodologies, SLO 3), the % met among Native Hawaiian students (n = 60) and non-Native Hawaiian students (n = 120), and the difference*. *Difference is the % met in the Native Hawaiian group minus the % met in the non-Native Hawaiian group.
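The difference column comes straight from the two groups' percent-met values. In this sketch the group sizes (60 and 120) come from the table above, but the counts of students who met are invented:

```python
# One outcome: how many students in each group met expectations (counts invented).
nh_met, nh_n = 42, 60           # Native Hawaiian group
non_nh_met, non_nh_n = 90, 120  # non-Native Hawaiian group

pct_nh = nh_met / nh_n
pct_non_nh = non_nh_met / non_nh_n

# Difference = % met in the Native Hawaiian group minus % met in the other group.
difference = pct_nh - pct_non_nh
print(f"{pct_nh:.0%} vs {pct_non_nh:.0%}, difference {difference:+.0%}")
```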
16 Chart: Percent of Students Who Met the Benchmark on Each SLO, comparing Native Hawaiian (NH) and non-Native Hawaiian (non-NH) students on analysis, use of sources, literature review, and methodologies. A Brief Review. Three guiding questions: achievement, change over time, difference across groups. Three basic statistics: tallies/counts, percentages, averages. Considerations: the audience, usefulness for decision making, benchmarks, and the source (single or multiple).
17 Summary of Themes. Text data: open-ended survey questions, focus group records, reflection papers, and student feedback (minute papers).
18 Theme Summary Strategies: narrative of trends and patterns; grouped listings; theme and category counts plus quotes. Narrative of trends and patterns, example: "The most prominent suggestion raised by the participants is to increase the length of the workshop, followed by the suggestion to post the material online. A few participants mentioned the following..."
19 Narrative of Trends and Patterns. Useful phrases for a report: "The greatest strength of the department recognized by the respondents is..."; "XXX is another common theme raised by the students. The main issues mentioned are..."; "The most prevalent themes/factors are..."; "To a lesser extent, X and Y are mentioned." Analysis strategies: quick read-throughs; random sampling to keep the task manageable; have a peer do it too (member check); thematic analysis (refer to the resource list).
20 Grouped Listings example. What was the one thing you learned in this workshop that you'll find most useful? Rubrics (13 comments): characteristics and advantages of different types of rubrics; descriptive rubrics seemed useful; examples of rubrics... Multiple choice (9 comments): creating multiple-choice questions; the criteria for writing good MC tests; tips for writing multiple choice... Self-reflection (5 comments): reflective writing; "I think these will be most useful"; "The self-reflection info will really work for my students"... General and miscellaneous (3 comments): great tips and tools; how to process and assess the assessment tools we use; that assessment encompasses test design and grading.
21 Grouped Listings considerations: use them when statements fall into a few discrete categories; mind the unit of analysis (comments or people?); you still need to interpret, e.g., "Participants mentioned rubrics most often as the most useful thing they learned at this particular workshop, with multiple-choice tests coming in second." Theme/Category Counts + Quotes. Table X. Most Useful Workshop Elements: Rubrics, 13 comments ("Characteristics and advantages of different types of rubrics"; "Creating rubrics is an excellent collaborative exercise by which department colleagues establish common goals"). Multiple choice, 9 comments ("The criteria for writing good MC tests"; "Creating multiple choice questions"). Self-reflection, 5 comments ("Reflective writing I think these will be most useful"). General and miscellaneous, 3 comments ("Mahalo for the coffee and snacks").
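A theme/category count table like Table X can be produced with a simple counter; the categories and counts below mirror the workshop example, with the coding itself assumed to have been done by hand:

```python
from collections import Counter

# Open-ended responses already coded into categories (counts from the example).
coded_comments = (
    ["Rubrics"] * 13
    + ["Multiple Choice"] * 9
    + ["Self Reflection"] * 5
    + ["General and Miscellaneous"] * 3
)

# Count comments per category and list them from most to least frequent.
counts = Counter(coded_comments)
for category, n in counts.most_common():
    print(f"{category}: {n} comments")
```

Note that the unit of analysis here is comments, not people; one respondent may contribute several coded comments.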
22 Theme summary strategies: narrative of trends and patterns; grouped listings; theme and category counts plus quotes. Review: considerations; techniques (tallies, percentages, averages, summary of themes).
23 Source: Chapter 16, Summarizing and Analyzing Assessment Results, in Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco, CA: Jossey-Bass. Tips for presentation: sort the results in a meaningful order; present only the information necessary for the intended audience; be concise (consider putting the detailed raw summaries in an appendix); avoid decimals in percentages; calculate valid percentages, using question completers as the denominator; consider visuals such as graphs and charts (mind the ink-to-information ratio); use lists and tables to organize themes.
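The "valid percentages" tip means dividing by the number of people who answered the question, not the number who received the survey; a sketch with made-up response counts:

```python
# Hypothetical item: 110 surveys went out, but only 100 people answered this question.
responses = {"Strongly disagree": 5, "Somewhat disagree": 15,
             "Somewhat agree": 65, "Strongly agree": 15}
surveys_distributed = 110  # NOT the denominator for valid percentages

# Valid percentages use the question completers as the denominator.
completers = sum(responses.values())
valid_pct = {level: n / completers for level, n in responses.items()}
total_agreement = valid_pct["Somewhat agree"] + valid_pct["Strongly agree"]
print(f"Completers: {completers}; total % agreement: {total_agreement:.0%}")
```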
24 Reporting elements: target SLOs; sampling (number, technique, time frame); evaluation process; summarization of results; intended use of results. Sample report: "Eight faculty members scored 40 randomly selected student research papers from fall 2013 senior writing courses to evaluate student achievement on the written communication outcome. 80% of the papers met or exceeded expectations, so we met the achievement benchmark of 75%. The faculty celebrated the success and published the achievement on the website. Faculty also scheduled a meeting to discuss improving the writing assignments to strengthen our success."
25 Sample Data Report, Multiple Years: "The curriculum committee scored between 30 and 50 randomly selected student papers each year from 2011 to 2013 to evaluate achievement on the written communication outcome. At least 80% of the students met or exceeded the outcome each year, exceeding the benchmark of 75%. There is a 6% increase from 80% in 2011 to 85% in 2013. The program has been making small but steady improvements over the three-year period. The faculty used the results in the following ways..." Questions?
26 Assessment Leadership Institute 2014, Hawai'i Hall 107, Assessment Office, University of Hawai'i at Mānoa. Data Analysis, Presentation, and Reporting.

Group Discussion Task A: Compare the processed summaries. (1) Identify the differences between the processed summaries and the raw summary. (2) For each processed summary, name an audience and purpose that is appropriate for that summary. (3) Which summary would you use for your program? Why?

Raw summary: outcomes (1. appropriate use of sources; 2. well-synthesized literature; 3. sound methodologies; 4. appropriate analysis; 5. correct interpretation) tallied as Below, Approaching, Met, or Exceeded.

Processed summary 1 (with details and consistent benchmarks across outcomes; ratings as Below / Approaching / Met / Exceeded, then total % Met/Exceeded):
Correct interpretation (SLO 5): 5% / 5% / 50% / 40%; 90%
Sound methodologies (SLO 3): 15% / 5% / 55% / 25%; 80%
Appropriate analysis (SLO 4): 5% / 15% / 65% / 15%; 80%
Appropriate use of sources (SLO 1): 20% / 10% / 50% / 20%; 70%
Well-synthesized literature (SLO 2): 10% / 20% / 60% / 10%; 70%
N = 100 students. Benchmark: 80% of the students Met or Exceeded expectations.

Processed summary 2 (concise, with consistent benchmarks across outcomes; total % Met/Exceeded, then benchmark met?):
Correct interpretation (SLO 5): 90%; Yes
Sound methodologies (SLO 3): 80%; Yes
Appropriate analysis (SLO 4): 80%; Yes
Appropriate use of sources (SLO 1): 70%; No
Well-synthesized literature (SLO 2): 70%; No
N = 100 students. Benchmark: 80% of the students Met or Exceeded expectations.

Processed summary 3 (concise, with inconsistent benchmarks across outcomes; total % Met/Exceeded, benchmark, difference):
Correct interpretation (SLO 5): 90%; 80%; +10%
Sound methodologies (SLO 3): 80%; 80%; 0%
Appropriate analysis (SLO 4): 80%; 80%; 0%
Well-synthesized literature (SLO 2): 70%; 80%; -10%
Appropriate use of sources (SLO 1): 70%; 90%; -20%
N = 100 students. Page 1 of 3
27 Group Discussion Task B: Compare the processed summaries. (1) Identify the differences between the processed summaries and the raw summary. (2) For each processed summary, name an audience and purpose that is appropriate for that summary. (3) Which summary would you use for your program? Why?

Raw summary: Through the workshop, I learned how to (1) use guiding questions in data analysis; (2) use descriptive statistics in data summarization; (3) choose a data presentation table appropriate for audience; (4) summarize achievement from a single source of data; (5) summarize achievement from multiple sources of data. Responses tallied as Strongly disagree, Somewhat disagree, Somewhat agree, or Strongly agree.

Processed summary 1 (items sorted by total % agreement; responses as Strongly disagree / Somewhat disagree / Somewhat agree / Strongly agree, then total % agreement):
Summarize achievement from multiple sources of data (I5): n = 100; 5% / 5% / 50% / 40%; 90%
Choose a data presentation table appropriate for audience (I3): 15% / 5% / 55% / 25%; 80%
Summarize achievement from a single source of data (I4): n = 100; 5% / 15% / 65% / 15%; 80%
Use descriptive statistics in data summarization (I2): – / 19% / 57% / 10%; 67%
Use guiding questions in data analysis (I1): – / 9% / 45% / 18%; 64%

Processed summary 2 (total % agreement, then benchmark met?; benchmark: 80% of the participants learned each technique):
Summarize achievement from multiple sources of data (I5): 90%; Yes
Choose a data presentation table appropriate for audience (I3): 80%; Yes
Summarize achievement from a single source of data (I4): 80%; Yes
Use descriptive statistics in data summarization (I2): 67%; No
Use guiding questions in data analysis (I1): 64%; No

Processed summary 3 (number of techniques learned, then % of participants): 3 or more, 40%; 2, 50%; 1, 10%; total, 100%. Notes: (1) the percent of participants is calculated from the 100 participants who answered every question; (2) the benchmark is that 80% of the participants learned at least 2 techniques.
28 Group Discussion Task C: Compare the processed summaries. (1) Identify the differences between the processed summaries and the raw summary. (2) For each processed summary, name an audience and purpose that is appropriate for that summary. (3) Which summary would you use for your program? Why?

Raw summary: for each SLO, the total # rated and % met expectations on each of three evidence sources (Evidence 1: essay; Evidence 2: presentation; Evidence 3: test scores), plus the average % met expectations (for example, SLO 1: 33% met on the presentation, n = 30, and 52% average % met).

Processed summary 1 (SLOs sorted by average % met expectations; the average is an unweighted average):
SLO 2: essay (n = 45 to 50) 80%; presentation (n = 30) not applicable; test scores (n = 100) 75%; average 78%
SLO 4: essay 76%; presentation 53%; test scores 80%; average 70%
SLO 3: essay 69%; presentation 40%; test scores 72%; average 60%
SLO 1: essay 64%; presentation 33%; test scores 60%; average 52%

Processed summary 2 (average % met expectations, benchmark, benchmark met?):
SLO 2: 78%; 75%; Yes
SLO 4: 70%; 70%; Yes
SLO 3: 60%; 60%; Yes
SLO 1: 52%; 60%; No

Resource list for thematic analysis (analysis of open-ended survey responses):
Where to start? Assessment Office workshop PowerPoint, available online at
Aronson, J. (1994). A pragmatic view of thematic analysis. The Qualitative Report, 2(1).
Krueger, R. A. (1998). Analyzing and reporting focus group results (Focus Group Kit, Vol. 6). Thousand Oaks, CA: Sage.
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.
Silverman, D. J. (2001). Interpreting qualitative data: Methods for analyzing talk, text and interaction. Thousand Oaks, CA: Sage.
