Content Validation of a School Effectiveness Measurement for Accreditation Purpose




Content Validity to Measure School Effectiveness

Content Validation of a School Effectiveness Measurement for Accreditation Purpose

Xin Liang (Ph.D.), Assistant Professor, Department of Educational Foundations & Leadership, University of Akron, liang@uakron.edu
Bin He, Department of Mathematics, University of Central Florida, Robinhe_1@yahoo.com
Richard Landry, Professor, Department of Educational Foundations & Research, University of North Dakota, Richard_landry@und.nodak.edu

ABSTRACT

Methods to identify school effectiveness and to collect timely, accurate data on student learning are an ongoing discussion in public education. Besides the traditional assessment of student performance (most often test scores), more and more educators rely on data that reflect a local school's operational processes to uncover needs, concerns, and performance. However, whether an instrument measures what it purports to measure rarely gets scrutinized. If we believe accurate decisions depend on accurate data, then instrument validation should not be taken lightly. The purpose of this study was to examine the content validity of a locally developed questionnaire to measure school effectiveness for NCA COS (North Central Association Commission on Schools) accreditation in a rural Midwestern school district. Seventeen schools (6,018 participants) were included in the study. Exploratory factor analyses, item analyses, and correlation coefficient tests were conducted to examine the domain structure, test composition, and reliability among the sub-content areas. A comparison was made to link the NCA COS standards with the five observed domain areas of the NCA School Improvement Questionnaire (NCASIQ). The results indicated that the five sub-content domains derived from the analysis measured only two of the five NCA COS standards. The study also implied that the traditional validation approach can be used as a tool to help administrators interpret data more accurately and make more effective decisions.

Key words: school accreditation, school effectiveness, content validity, exploratory factor analysis.

INTRODUCTION

Each year, hundreds of schools at different levels go through a multi-faceted review process to be accredited. The major focus for schools in this process is not just to be successful in accreditation, but also to self-evaluate how well the school meets students' needs. Furthermore, it is hoped that the accreditation process of clarifying goals and objectives and systematically collecting data to determine strengths and weaknesses will accelerate school improvement. The underlying assumption in this process is that effective educators make effective decisions, and effective decisions are based on accurate information (Johnson, 2000). One way to ensure more accurate data is to build a valid measurement. In today's data-driven practice, educational administrators and schools call for good evaluation instruments to ensure the quality of results that will document success and identify possible areas for improvement. Clearly, the data obtained from an assessment methodology used for any professional purpose should be psychometrically appropriate and sound (Gall, Gall, & Borg, 2003). However, there is scant evidence that such assessments use psychometrically sound methodology (Thompson, 2003). To prepare for accreditation, schools or a school district need to initiate a data collection system that reflects local school operational dynamics. One of the common approaches in this endeavor is to develop an assessment that meets those needs. By doing so, schools and school districts usually have more autonomy in determining when, where, and how to administer data collection according to school needs. Once the instrument is in place, the schools and school district can use the same measurement to consistently

assess school effectiveness and record progress to collect longitudinal data. Even though a self-designed assessment has the advantage of reflecting the needs and concerns of local stakeholders, its psychometric properties are rarely addressed. NCA COS (North Central Association Commission on Schools) is a school accreditation organization for the north central region, which includes nineteen states. Its goal is to be proactive in promoting a system of education that enhances student learning and ensures successful school transitions for its learners through the provision of standards and evaluation services for its schools (Corkill, 1998). Each year NCA COS accredits schools with standards encompassing five major areas: School Improvement Plan; Information System; Process of Schooling; Vision, Leadership-Governance & School Community; and Resources and Allocation (NCA COS, 2000, p. 1). To help keep schools well informed about the accreditation process, NCA has established a website to share information (NCA, 2003). Schools are also able to administer an online preliminary school effectiveness questionnaire by accessing the NCA website (NCA, 2003). However, many schools develop self-reported instruments with items that depict local school characteristics aligned with the five areas of the NCA COS standards. Even though these locally constructed instruments were designed to measure school effectiveness against the NCA COS evaluation standards, their validity and reliability were never reported. Consequently, it is hard to determine whether the instruments truly contain the content areas that the NCA COS standards encompass. There is a need to investigate the validity and reliability of such measurements,

and to examine to what extent the content domains were consistent with the NCA COS accreditation standards. The purpose of this study was to address this need. The administrators of the school district in the present study requested a validation study of the self-developed NCA COS School Improvement Questionnaire (NCASIQ). The personnel who developed the questionnaire were in the process of establishing an information system that could record longitudinal data about school effectiveness, and they planned to integrate the questionnaire data into that system. The administrators of the schools and school district were particularly concerned about whether the questionnaire measured what they expected it to measure. For psychometric purposes, the sample size (N = 6,018) was large enough to conduct empirical analyses. Scores derived from the data analyses for NCA COS accreditation were relatively consistent with local perceptions of the issues on school effectiveness, as evidenced by the acceptance of the technical report and positive feedback from school district personnel. The NCA School Improvement Questionnaire (NCASIQ) was administered in spring 2002 for NCA COS accreditation purposes, and the entire school district was accredited by NCA COS that year. The same data were used to conduct the ex post facto validation study reported here.

METHOD

Participants

A total of 6,018 participants were included in this study; among them, 2,383 were parents (39.6% of the entire sample), 3,082 were students (51.2%), 242 were staff members (4.0%), 268 were teachers (4.5%), and 24 were administrators (0.4%). The

percentage of males and females in the sample was about the same, and the ethnicity and socioeconomic status of the participants reflected the demographic composition of a rural school district, mostly (over 95%) Caucasian. Females were over-represented in the parent sample, with 1,604 (71.1%) being female. Students were evenly distributed as to gender (females = 48.8%, males = 51.2%). As was true in the parent category, females were over-represented among staff members (76.0%), though they were somewhat under-represented in the administration category (29.2%). Most respondents (74.1%) were European American, although 11.9% were American Indian. All other ethnic groups combined made up 7.7% of the respondents; 379 (6.3%) did not respond to the racial/ethnic item.

Questionnaire Development

The 2002 NCA School Improvement Questionnaire was developed through a collective effort of school administrators, teachers, paraprofessionals, parents, community members, and a research consultant from a nearby university. The research consultant met with the Board of Education and Superintendent to discuss the development of the questionnaire, and the board provided input concerning what should be included in the survey. The consultant also met with the K-12 administrators and completed the same process as was done with the board. After collecting input from the local personnel, the consultant developed a draft questionnaire using the content suggested by the board and the administration. Fifty-eight items were included in the instrument, each rated on a Likert scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree), across 7 sub-domain areas: 1) Teaching and Learning (TL); 2) Student Behavior/Safety (SBS); 3) Support Services (SS); 4) Student Activities (SA); 5) Building Administration (BA); 6) Academic Program (AP); and 7) Parent and Community Involvement (PCI). Thirteen items were in TL, 8 items in SBS, 6 items in SS, 4 items in SA, 4 items in BA, 20 items in AP, and 7 items in PCI, for a total of 58 items. Once the questionnaire was finalized, distribution and data collection were discussed to assure consistency across the entire district. This process was administered via standard school protocol.

Validation Analysis Procedures

The demographic composition of the sampled population was calculated to obtain a norm reference. A principal-components analysis with orthogonal iterations was generated to obtain initial factors for further examination. The alpha factoring method with varimax rotation was then conducted to examine the sub-content factor loadings for each factor derived from the initial solution. The criteria for selection of a factor and for item saliency were (a) an eigenvalue of 1.00 or greater together with the scree plot, and (b) a factor structure/pattern coefficient of .40 or greater, respectively (Ford, MacCallum, & Tait, 1986). Item analyses within each subtest, and internal-consistency reliability, were conducted to examine the internal consistency of the instrument. The contribution of each item, within each subtest, to the composite score was also examined. It is expected that the subtests should contribute significantly to the overall NCASIQ composite; individually, however, the sub-domain areas are insufficient measures of school effectiveness because it is the composite of these subtests that yields a reliable measure (Nunnally & Bernstein, 1994).
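The retention rules described above (eigenvalue of 1.00 or greater plus scree inspection for factors) can be sketched in a few lines of Python. The response matrix below is simulated for illustration only; it is not the study's data, and the two-factor structure is an assumption built into the simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated Likert-style responses: 500 respondents x 12 items,
# generated from two latent factors so a clear structure exists.
latent = rng.normal(size=(500, 2))
loadings = np.vstack([np.repeat([[1.0, 0.0]], 6, axis=0),
                      np.repeat([[0.0, 1.0]], 6, axis=0)])
X = latent @ loadings.T + rng.normal(scale=0.8, size=(500, 12))

# Eigenvalues of the item correlation matrix (principal components).
R = np.corrcoef(X, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]

# Kaiser criterion: retain components with eigenvalue >= 1.00.
n_retained = int(np.sum(eigvals >= 1.0))

# Proportion of total variance explained by the retained components
# (the study reports 45.0% for its five retained factors).
explained = eigvals[:n_retained].sum() / eigvals.sum()
print(n_retained, round(float(explained), 3))
```

In practice the eigenvalue rule is checked against the scree plot, as the study does, since the criterion alone can over- or under-extract.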

RESULTS

Factor Analyses

Factor analysis is a basic tool for explicating constructs, and the major aspect of this explication is to determine the extent to which hypothesized measures of a construct actually measure the same thing versus breaking up into clusters of variables that measure different things (Nunnally & Bernstein, 1994). This aspect of exploratory factor analysis was applied to examine the composition of the sub-content areas of the locally developed NCASIQ. An initial solution was generated to cluster the items in the instrument into some smaller number of homogeneous sets. With the factors of homogeneous sets obtained in the initial solution, a further examination of the common loadings across the derived factors was undertaken. This type of analysis leads to the identification of the sub-content areas that existed for measuring school improvement in the questionnaire. For the purpose of identifying underlying factors (Kachigan, 1986), the initial factors derived from the principal-components analysis were subsequently rotated using orthogonal (varimax) rotation. Both five- and seven-factor solutions were initiated to examine the factor loadings, and factors were identified based on the pre-set criteria for factors and items. After examining the factor loadings for both sets (5 versus 7 components), five factors were retained from the initial solution, as demonstrated by the scree plot in Figure 1. The five factors accounted for 45.0% of the total common variance of the instrument.

Figure 1: Scree plot of the initial principal-components solution

As initial factors are typically difficult to interpret, a second stage of rotation made the final result more interpretable (Nunnally, 1994). The varimax iteration method was conducted to examine the detailed factor loadings and the variance among the different loadings. Varimax rotation is an iterative method that minimizes factor complexity by maximizing the variance of each factor (Mertler & Vannatta, 2002). The results demonstrated that the items clustered around five sub-content areas, labeled as Factor 1: Teaching and Learning (13 items), Factor 2: Building Administration/Support & Parent Involvement/Community (17 items), Factor 3: Academic Program (16 items), Factor 4: Student Behavior/Safety (6 items), and Factor 5: Student Activities/Resources (6 items).
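Varimax rotation as described here can be implemented directly with an SVD-based update. The sketch below is a standard formulation of the algorithm, applied to a small illustrative loading matrix (not the study's loadings):

```python
import numpy as np

def varimax(loadings, tol=1e-8, max_iter=500):
    """Orthogonally rotate a loading matrix to maximize the
    variance of squared loadings within each factor."""
    L = np.asarray(loadings, dtype=float)
    p, k = L.shape
    R = np.eye(k)            # accumulated rotation matrix
    var_sum = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        # Gradient of the varimax criterion, then its SVD.
        grad = L.T @ (Lr**3 - Lr @ np.diag((Lr**2).sum(axis=0)) / p)
        u, s, vt = np.linalg.svd(grad)
        R = u @ vt
        if s.sum() < var_sum * (1.0 + tol):
            break                     # criterion stopped improving
        var_sum = s.sum()
    return L @ R

# Two-factor example: unrotated loadings with mixed structure.
L = np.array([[0.7, 0.3], [0.8, 0.2], [0.3, 0.7], [0.2, 0.8]])
rotated = varimax(L)
```

Because the rotation is orthogonal, the communalities (row sums of squared loadings) are unchanged; only the distribution of loading variance across factors changes, which is what makes rotated solutions easier to label.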

Therefore, the questionnaire actually measured five content areas, instead of the seven originally designed. The items clustered in Factor 1, Teaching & Learning, and Factor 3, Academic Program, remained the same as in the original sub-content areas. Noticeably, the items in these two content areas made up more than half of the total items (32 items out of 58). All 6 items of the SS (Support Services) subtest had loadings below .40, and they all clustered on the Parent Involvement & Community subtest, along with all the items of Building Administration. Table 1 presents the items and the loading values for the five factors; factor structure/pattern coefficients of .40 or greater indicate the items most salient to each factor.

Table 1: NCASIQ Items and Factor Loadings

Item                                                         I      II     III    IV     V

Teaching & Learning
 1. Teachers have pos relationship w/students              .525   .225   .100   .225   .042
 2. Teachers use class time effectively                    .600   .202   .148   .152   .098
 3. Teachers provide quality assignments                   .593   .225   .166   .221   .111
 4. Classroom work is challenging for students             .393   .043   .159   .151   .127
 5. Teaching techniques used are effective                 .596   .243   .173   .199   .121
 6. Students expected to do quality work in class          .551   .103   .169   .104   .208
 7. Students believe their teachers are fair               .519   .378   .126   .311   .059
 8. Students learning to use technology                    .402   .174   .177   .157   .161
 9. Academic expectations for students clearly stated      .493   .237   .188   .154   .196
10. Teachers understand/meet needs of each student         .540   .355   .168   .272   .145
11. Teachers enjoy teaching                                .496   .325   .146   .196   .103
12. Teachers have high expectation for students            .546   .162   .196   .130   .181
13. Teachers respect students                              .525   .413   .154   .255   .093

Student Behavior & Safety
14. Students discipline in fair/consistent manner          .369   .401   .119   .323   .113
15. Students show respect for teachers/other adults        .234   .170   .132   .564   .090
16. Students feel safe at school                           .303   .363   .130   .452   .109
17. Students feel sense of belonging in school             .360   .394   .128   .464   .111
18. Students support and respect each other                .227   .215   .126   .627   .118
19. Students behave appropriately in school                .225   .190   .135   .682   .057
20. Students accept differences in others                  .225   .245   .096   .487   .125
21. Building facilities are well maintained                .386   .450   .091   .211   .108

Support Services
22. Health service provides adequate care/students         .294   .337   .210   .100   .203
23. Counselor helps students w/personal problems           .121   .310   .220   .083   .346
24. Counselor helps students w/education plan              .081   .213   .295   .112   .333
25. School lunch is healthy/satisfying                     .173   .340   .215   .211   .126
26. Media center has adequate resources                    .255   .307   .228   .112   .261
27. School district supplies well to students              .257   .379   .227   .167   .300

Student Activities
28. Extracurricular activity appropriately balanced        .124   .166   .157   .122   .594
29. Students given opportunity to participate              .155   .112   .196   .028   .559
30. Extracurricular activity are an important part         .124   .103   .101   .044   .481
31. Extracurricular activity are high quality              .130   .149   .204   .111   .575

Building Administration
32. Principal provides effective leadership                .234   .579   .150   .183   .195
33. School practice/procedures non-discriminate            .295   .484   .168   .161   .228
34. Office staff is friendly/courteous                     .266   .451   .145   .181   .116
35. Principal effective with discipline                    .242   .543   .122   .151   .196

Academic Programs
36. Students receive quality instruction/health            .267   .306   .399   .080   .228
37. Students receive quality instruction/PE                .305   .227   .365   .193   .275
38. Students receive quality instruction/reading           .385   .303   .444   .116   .123
39. School program meets academic needs of at risk         .203   .346   .369   .152   .214
40. Students receive quality instruction/math              .381   .273   .373   .034   .184
41. School programs meet behavior needs of at risk         .193   .326   .369   .193   .227
42. Students receive quality instruction/tech-voc ed       .250   .142   .452   .076   .181
43. Students receive quality instruction/spelling          .360   .278   .445   .092   .087
44. Gift/talent students appropriately challenged          .150   .091   .444   .205   .199
45. Students receive quality instruction/bus ed            .226   .082   .574   .211   .124
46. Students receive quality instruction/foreign language -.109   .007   .453   .173   .065
47. Students receive quality instruction/Eng-language arts .304   .226   .574   .012   .118
48. Students receive quality instruction/writing           .333   .229   .573   .050   .100
49. Students receive quality instruction/science           .299   .174   .501   .021   .156
50. Students receive quality instruction/soc science       .305   .249   .425  -.040   .149
51. Students receive quality instruction/fine arts         .210   .218   .448   .076   .156

Parent Involvement & Community
52. Community values students achievement                  .219   .343   .301   .134   .225
53. School communicates effectively w/me                   .234   .570   .241   .194   .156
54. I feel comfortable talking w/teachers                  .297   .551   .186   .140   .063
55. I feel comfortable talking w/principal                 .162   .635   .129   .131   .057
56. Parents involved in school management/activities       .074   .414   .275   .145   .193
57. Parents have input in school matters                   .128   .432   .278   .140   .208
58. Parents involved in their child's education            .125   .345   .269   .096   .222

The exploratory factor analysis of the NCASIQ suggested that the content structure of the assessment mirrored five content areas: 1) teaching and learning, 2) curriculum programs, 3) student behavior, 4) student activities, and 5) school administration/governance/community, instead of the seven initiated in the questionnaire. The factor model did not match the five domain areas of the NCA COS standards: 1) School Improvement Plan, 2) Information System, 3) Process of Schooling, 4) Vision, Leadership-Governance & School Community, and 5) Resources and Allocation. Further examination was needed to locate which content domains were aligned with the standards and which were not, so that modifications could be made.

Item Analyses and Internal Reliability

Once the content composition of the instrument was identified, the relationships among the five sub-content areas were examined to determine the individual sub-domain contributions to the composite score. One of the basic criteria for the content validity of an instrument is that at least a moderate level of internal consistency should exist among the items; i.e., the items should tend to measure something in common (Nunnally & Bernstein, p. 103). In order to examine the consistency of the items in each factor, and the overall consistency of the instrument, an item analysis for each factor was conducted. Then an item analysis for the instrument as a whole was generated to obtain the contribution of each item to the composite score. Coefficient alpha was also calculated. Item

analyses indicated that the items in each factor had a positive contribution to the composite score of their sub-content area. The coefficient alphas of the five factors were .90 (TL), .90 (PIC), .88 (AP), .85 (SBS), and .75 (SAS). No negative contribution was found among the 58 items for the composite score of the NCASIQ. The reliability coefficient alpha for the NCASIQ was .96, indicating the instrument was reliable. Table 2 presents the internal consistency information for the total NCASIQ and for each of the five sub-content areas derived from the factor analyses.

Table 2: Internal Consistency Reliability of Items on the NCASIQ Five Sub-Content Areas Based on the Factor Analyses (N = 6,018)

Reliability statistic                    T&L     PIC     AP      SBS     SAS     Total
Number of items in test                   13      17      15       6       6      58
Item mean                               3.08    2.94    2.99    2.80    3.06    2.98
Item variance                           .392    .483    .339    .463    .346    .407
Mean corrected inter-item correlation   .615    .563    .547    .639    .483    .536
Mean of inter-item correlations         .420    .353    .341    .492    .328    .299
Reliability coefficient alpha           .904    .902    .883    .853    .745    .960

DISCUSSION

The empirical examination of the domain areas of this self-reported assessment for NCA COS school effectiveness accreditation purposes produced mixed results. On the one hand, the NCASIQ is a reliable measurement, which satisfies an important prerequisite for validity. The item analysis examined both the individual and collective contributions to the composite score of the instrument. The outcome was positive, indicating each item was a significant contributor to the NCASIQ. The correlation coefficients demonstrated that the instrument was reliable, and each subtest was consistent in measuring a sub-domain area with a moderate to high coefficient alpha.
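Coefficient alpha, as reported in Table 2, follows the standard formula alpha = (k / (k - 1)) * (1 - sum of item variances / variance of the composite). A minimal computation on simulated item scores (not the study's data; the six "parallel" items are an assumption of the simulation):

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha from a respondents-by-items score matrix."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)          # per-item variances
    total_var = X.sum(axis=1).var(ddof=1)      # variance of composite score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
true_score = rng.normal(size=400)
# Six parallel items: a shared true score plus independent noise.
X = true_score[:, None] + rng.normal(scale=0.7, size=(400, 6))
alpha = cronbach_alpha(X)
```

As the paper notes, a high alpha (here driven by the shared true score) establishes reliability only; it says nothing about whether the items measure the intended content domain.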

Reliability, however, is only a prerequisite for validity: quite often a reliable instrument does not necessarily measure what it is designed to measure in the predetermined content area. The exploratory factor analysis of the NCASIQ suggested that the content structure of the assessment mirrored five aspects: 1) teaching and learning, 2) curriculum programs, 3) student behavior, 4) student activities, and 5) school administration/governance/community. The five domain areas of the NCA COS standards were: 1) School Improvement Plan, 2) Information System, 3) Process of Schooling, 4) Vision, Leadership-Governance & School Community, and 5) Resources and Allocation. It is worth noting that in Standard 3, School Processing, NCA COS specifically defined four categories: 1) teaching-learning, 2) curriculum, 3) student activities, and 4) pupil service as the major content areas for evaluating school processing effectiveness (NCA COS, 2000). Compared against the NCA COS standards, four of the five observed factors from the study did seem to match the four sub-content domains in Standard 3, School Processing: 1) Teaching & Learning, 2) Academic Program, 3) Student Activities, and 4) Student Behavior/Safety. The remaining items clustered on observed Factor 2, Building Administration/Support & Parent Involvement/Community, a content domain quite similar to NCA Standard 4, Vision, Leadership-Governance & School Community. Table 3 illustrates the details of the NCA COS standards and the observed factors in the NCASIQ, and how the two compare in terms of content domains.

Table 3: Comparison of NCA COS Standards and Observed Factors of the NCASIQ

NCA COS standard                                    Observed factor in NCASIQ                                        Match
School Improvement Plan                                                                                              No
Information System                                                                                                   No
School Processing                                                                                                    Yes
  1. Curriculum                                     Academic Program                                                 Yes
  2. Teaching-Learning                              Teaching & Learning                                              Yes
  3. Pupil Personnel Service                        Student Behavior/Safety                                          Yes
  4. Student Activities                             Student Activities                                               Yes
Vision, Leadership-Governance & School Community    Building Administration/Support & Parent Involvement/Community   Yes
Resources and Allocation                                                                                             No

It is quite obvious that the self-developed NCASIQ was actually measuring only two of the five NCA COS accreditation standards: School Processing and Vision, Leadership-Governance & School Community. The other three standards, School Improvement Plan, Information System, and Resources and Allocation, were absent from this particular instrument. With a closer look at the NCA COS standards, we found that Standard 1, School Improvement Plan; Standard 2, Information System; and Standard 5, Resources and Allocation could be evaluated and observed directly via data sources other than instrumental measurement. For example, NCA COS Standard 1, School Improvement Plan, could be reviewed by the NCA Commission with a checklist at the site. The data derived from a well-designed instrument assessing school effectiveness outcomes would be one strong indicator for an information system. The criteria for resources and resource allocation would be the school infrastructure in terms of

teacher/student ratio, finance, and the quality of school buildings. These school effectiveness indicators could be measured through standardized documentation. It is more challenging, however, for schools to evaluate school processing, leadership, school administration-governance, and community relations with well-documented data sources. School administrators and instrument developers need to communicate with school personnel, community members, and parents to specify the definitions and indicators of effectiveness for school processing, vision, leadership, and community relations. Without a clear, agreed-upon benchmark, the measurement lacks a solid foundation, and the resulting data are unlikely to support efficient decision making. As a self-developed measure, the NCASIQ could serve only as a rough estimate of certain dimensions of the NCA COS standards; further procedures are needed to clarify the intended domain areas and align them with the standards. Overlooking this alignment is a common mistake among practitioners engaged in assessment activities. In particular, when pressured to accomplish a comprehensive task such as accreditation in a timely manner with a limited budget and little technical support, school administrators tend to overlook data accuracy. The description of the instrument development process and the techniques demonstrated in this study could serve as a reference for similar schools or school districts collecting data for accreditation purposes in the future. The study also implied that the traditional validation approach could be used as a tool to help administrators interpret data more accurately and make more effective decisions. Even though NCA COS is one of the major school accreditation and evaluation organizations, no study has been conducted to examine the evaluation standards in

contrast with a measurement matched to local performance to document progress. It is our hope that the exploratory analysis of this study, with its reliability and validity information, can serve as a benchmark for future researchers and instrument developers measuring school effectiveness for accreditation and other evaluation purposes. Like successful business people and effective companies, educators and school districts are being asked to prove their bottom line with hard, solid data (Lafee, 2002). Improvement must be reported in such a way that the results clearly indicate whether the implementation of an improvement plan accomplished its selected goals. A well-designed evaluation instrument will ensure accurate data for an assessment system that supports continuous improvement efforts, linking improvement, evaluation, and research together as a continuum (McNamara & McNamara, 2000).

References

Corkill, M. P. (1998). Assessing the North Central experience: A superintendent's perspective. NCA Quarterly, 72(3), 12-17.

Ford, J. K., MacCallum, R. C., & Tait, M. (1986). The application of exploratory factor analysis in applied psychology: A critical review and analysis. Personnel Psychology, 39(1), 291-314.

Gall, M. D., Gall, J. P., & Borg, W. R. (2003). Educational research: An introduction (7th ed.). Boston: Allyn and Bacon.

Johnson, H. J. (2000). Data driven school improvement. Journal of School Effectiveness, 1(1), 25-33.

Kachigan, S. K. (1986). Statistical analysis: An interdisciplinary introduction to univariate & multivariate methods. New York: Radius Press.

Lafee, S. (2002). Data-driven districts. School Administrator, 12(2), 6-15.

McNamara, J. F., & McNamara, M. (1999). Constructing new measures in program evaluation. International Journal of Educational Reform, 8(1), 99-104.

Mertler, C. A., & Vannatta, R. A. (2002). Advanced and multivariate statistical methods. Los Angeles, CA: Pyrczak Publishing.

NCA COS. (2000). Standards and criteria for elementary, middle level, secondary and unit schools. Tempe, AZ: North Central Association Commission on Schools.

NCA website. http://www.ncacasi.org/sitools/. Retrieved May 24, 2003.

Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.

Thompson, W. D., Loesch, C. L., & Seraphine, E. A. (2003). Development of an instrument to assess the counseling needs of elementary school students. Professional School Counseling, 7(1), 35-39.