Resolving ABET/TAC Criteria on Continuous Improvement: Surviving ABET Accreditation!

Nasser Alaraje, Michigan Technological University, alaraje@mtu.edu

Abstract: The Electrical Engineering Technology (EET) program developed a Program Outcomes (POs) and Program Educational Objectives (PEOs) assessment process in response to Accreditation Board for Engineering and Technology (ABET) requirements. The EET Program Outcomes capture the desired attributes that the EET program aspires to impart to its students through the curriculum and the academic experience [1]; more precisely, what students should know upon graduation. On the other hand, Program Educational Objectives (PEOs) describe the attributes that graduates are expected to possess three to five years after graduation [1]. This paper details the assessment process developed by the program, as well as the implementation that took place in the academic year 2007-2008. An interim report submitted to ABET resulted in the resolution of the institutional weakness regarding ABET Criterion 3 (Assessment and Evaluation) [1]. The EET POs and PEOs are identified in line with ABET's Technology Accreditation Commission (TAC). For each Program Outcome, a combination of direct and indirect assessment tools has been identified, in addition to the performance criteria for each assessment tool. The Program Educational Objectives are measured using a set of indirect assessment tools. This paper discusses the POs and PEOs, their assessment tools, how these tools are used in assessment, how frequently data is collected for each assessment tool, who is responsible for data collection and analysis, and how data is used for continuous improvement. This paper provides information on assessment process issues and challenges, and will be of benefit to engineering technology programs seeking accreditation or re-accreditation.

I.
Introduction

The curriculum of the Electrical Engineering Technology program provides a broad-based educational experience emphasizing practical, hands-on laboratory work, closely coordinated with theoretical classroom discussion. Students receive a solid foundation of coursework in electric circuits, digital electronics, solid-state electronics, communications, power and electrical machinery. The Electrical Engineering Technology program has developed a Program Outcomes (POs) and Program Educational Objectives (PEOs) assessment process to fulfill ABET accreditation requirements. It is an outcome-based assessment in which the POs and PEOs should meet the needs of the program constituents. Program Outcomes (POs) capture the desired attributes that the EET program at the School of Technology aspires to impart to its students through the curriculum
and academic experience. According to ABET's definition of POs, Program Outcomes are narrower statements that describe what students are expected to know and be able to do by the time of graduation [1]. On the other hand, PEOs describe the attributes that the program desires its graduates to possess three to five years after graduation. According to ABET's definition of PEOs, Program Educational Objectives are broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve [1]. A set of assessment tools with performance criteria has been identified for each PO and PEO. Data has been regularly collected, assessed and evaluated against the performance criteria to ensure that each PO and PEO has been assessed and goals have been met. Results of the assessment process are then utilized in continuous improvement actions.

II. EET Program Outcomes

The EET Program Outcomes capture the desired attributes that the EET program at the School of Technology (SoT) aspires to impart to its students through both the curriculum and the academic experience. The desired outcomes of the EET program were adopted from ABET's (a) through (k) outcomes, shown in Table 1 as Outcomes 1 through 11. ABET's Electrical/Electronics Engineering Technology Program Criteria outcomes are mapped to Outcomes 12 through 16, as shown in Table 1.
PO 1: An appropriate mastery of the knowledge, techniques, skills and modern tools of the discipline (ABET 2.a)
PO 2: An ability to apply current knowledge and adapt to emerging applications of mathematics, science, engineering and technology (ABET 2.b)
PO 3: An ability to conduct, analyze and interpret experiments and apply experimental results to improve processes (ABET 2.c)
PO 4: An ability to apply creativity in the design of systems, components or processes appropriate to the program objectives (ABET 2.d)
PO 5: An ability to function effectively on teams (ABET 2.e)
PO 6: An ability to identify, analyze and solve technical problems (ABET 2.f)
PO 7: An ability to communicate effectively (ABET 2.g)
PO 8: A recognition of the need for and ability to engage in lifelong learning (ABET 2.h)
PO 9: An ability to understand professional, ethical and social responsibilities (ABET 2.i)
PO 10: A respect for diversity and a knowledge of contemporary professional, societal and global issues (ABET 2.j)
PO 11: A commitment to quality, timeliness and continuous improvement (ABET 2.k)
PO 12: The application of circuit analysis and design, computer programming, associated software, analog and digital electronics, and microcomputers to the building, testing, operation and maintenance of electrical/electronic(s) systems (ABET 8.a)
PO 13: The application of physics or chemistry to electrical/electronic(s) circuits in a rigorous mathematical environment at or above the level of algebra and trigonometry (ABET 8.b)
PO 14: The ability to analyze, design and implement control systems, instrumentation systems, communication systems or power systems (ABET 8.c)
PO 15: The ability to apply project management techniques to electrical/electronic(s) systems (ABET 8.d)
PO 16: The ability to utilize statistics/probability, transform methods, discrete mathematics, or applied differential equations in support of electrical/electronic(s) systems (ABET 8.e)

Table 1: Electrical Engineering Technology Program Outcomes [1]
III. Program Outcomes Assessment Tools and Timelines

The EET program developed a set of assessment tools to measure the POs defined in Table 1, in response to ABET Criterion 3 (Assessment and Evaluation). The EET program POs assessment tools include a combination of direct and indirect measures [2,3,4] of the EET program. Examples of direct measures are the Senior Project Evaluation, the End of Semester Course Assessment and the Senior Exit Exam. The Senior Exit Survey and Student Rating of Instruction are examples of indirect measures. Table 2 summarizes the various tools used to collect data in order to assess the Program Outcomes. The table also identifies the person responsible for collecting and analyzing the data, as well as the frequency of each assessment tool. It is important to have a process in place detailing the timelines for data collection for each assessment tool, because not every assessment tool needs to be implemented every semester. A brief description of each tool is provided below.

Table 2: Summary of PO Assessment Tools

Assessment Tool                Responsible for Data Collection/Analysis              Frequency
Course Assessment              Faculty                                               Each semester
Student Rating of Instruction  Center for Teaching, Learning, & Faculty Development  Each semester (all courses)
Senior Exit Survey             SoT Staff                                             Each semester
Senior Project Evaluation      Faculty                                               Annually
Senior Exit Exam               Faculty                                               Each semester

Course Assessment

All faculty members are required to conduct an individual course assessment at the end of each semester. Data gathered during this process is used to make adjustments and improve the student learning experience. Each course objective is linked to POs, and for each course objective the faculty member chooses an assessment tool, or set of assessment tools, which can include assignments, labs, exams, quizzes, and performance projects. The achievement standard is that 70% of the students perform at a level of 70% or higher for each of the course objectives, which are linked to the program outcomes (ABET C.2 a-k, C.8 a-e).
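The 70/70 achievement standard is simple to check programmatically. Below is a minimal, hypothetical Python sketch; the function name and score data are illustrative, not part of the program's actual tooling:

```python
# Hypothetical check of the course-assessment standard: an objective is met
# when at least 70% of students score 70% or higher on the linked work.

def objective_met(scores, score_cutoff=70.0, fraction_required=0.70):
    """Return True if enough students meet the per-student cutoff."""
    if not scores:
        return False  # no data: the standard is not demonstrated
    passing = sum(1 for s in scores if s >= score_cutoff)
    return passing / len(scores) >= fraction_required

# 8 of these 10 illustrative scores (80%) are 70 or better, so the standard is met.
quiz_scores = [95, 88, 72, 70, 65, 90, 81, 77, 55, 70]
print(objective_met(quiz_scores))  # → True
```

The same check would be applied once per course objective, with the per-objective results feeding the list of continuous improvement actions.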
Based on the results of course assessment, a list of continuous improvement actions is compiled and recommended for implementation in the next course offering. An example of the course assessment tool is shown in Table 3.

Student Rating of Instruction

A standard university instrument administered by the Center for Teaching, Learning and Faculty Development is used to obtain student feedback regarding instructor performance in the classroom. There are twenty questions on the survey; the first eighteen are based on best practices and cover not only the curriculum but also classroom and lab facilities. Questions 19 and 20 are intended to elicit students' feedback on their overall assessment of the instruction. Students are also encouraged to provide written comments to help improve teaching practice. Each faculty member also adds extra questions to elicit responses from students on their overall assessment of achieving the declared course objectives, which are linked to the program outcomes (ABET C.2 a-k, C.8 a-e).
Table 3: End of Semester Course Assessment Example

Course: EET141 Digital Systems/Microprocessor Basics    Semester: Spring 2008

Each course objective (see the syllabus for the complete statements) is linked to program outcomes and assessed through a mix of exam question blocks, labs, and computer projects. The percentage of students meeting the standard on each objective:

1. Understand the basic logic gates and combinational logic functions, symbols, truth tables, timing diagrams, and logic circuits: 83.1%
2. Simplify complex logic circuits by applying Boolean algebra laws and theorems and Karnaugh mapping: 78.3%
3. Understand the operation of basic counters, decoders, multiplexers and arithmetic circuits: 71.9%
4. Understand binary, BCD, the parity method for error detection, and the need for alphanumeric codes, especially the ASCII code: 80.5%
5. Perform binary addition, subtraction, multiplication, and division on binary (using the 2's-complement system) and hexadecimal numbers: 92.5%
6. Understand the basic types of flip-flop: 72%
7. Use modern computer tools for digital design verification: 83.3%
8. Understand the characteristics of modern programmable logic devices: 71%
Senior Exit Exam

The Senior Exit Exam is administered to measure student competence in areas identified as critical to the EET program. The exam consists of 50 multiple-choice questions, with five questions for each representative course. The program anticipates using a nationally normed assessment tool, currently under development by the Electrical and Computer Engineering Technology Department Heads Association (ECETDHA) and supported by the Institute of Electrical and Electronics Engineers (IEEE), when it becomes available.

Senior Project Evaluation

The Senior Project Evaluation comprises the student's performance on the senior capstone project as measured by an examiner. The examiner attends the project presentations at the end of the semester and assesses each student based on relevant criteria, using a level ranking assigned to quantify the senior project examiner's opinion. Each project was assessed by at least two examiners drawn from the Industrial Advisory Board and the faculty. A rubric was developed to help in assessing performance on the senior project.

Senior Exit Survey

The EET program has developed a written questionnaire for graduating students, called the Senior Exit Survey, which all graduating seniors are asked to complete. It was completed by 17 of 21 graduating seniors, an excellent response rate of almost 81%. The student feedback data is used to help identify any emerging trends in either a positive or a negative direction.

IV. EET Program Educational Objectives

The EET PEOs describe the attributes that the EET program desires its graduates to possess three to five years after graduation, acquired through both the curriculum and the academic experience [1]. The PEOs support the mission of the School of Technology and the EET program. The EET program constituencies are the program faculty members, the EET IAB, the EET alumni and their employers. The PEOs were established based on input from the EET program constituencies and are scheduled for review and update on a five-year interval.
When setting the PEOs, it is important to make a distinction from the Program Outcomes (POs). While the PEOs describe the career and professional accomplishments that graduates possess three to five years after graduation [1], the POs are much narrower statements that describe what students are expected to know upon graduation [1]. Accordingly, the set of assessment tools for the PEOs relies on both alumni and employer surveys. Table 4 shows the EET PEOs.

Table 4: Electrical Engineering Technology Program Educational Objectives [1]

PEO 1: Graduates of the program will be well prepared for their first position in the field.
PEO 2: Graduates of the program will be successfully employed in a degree-related job or pursuing an additional degree.
PEO 3: Employers will be satisfied with the performance of the program graduates, including skills such as effective teamwork and communication in a professional environment, and professional ethics.
PEO 4: Graduates of the program will be satisfied with their education and show the ability to continuously improve their skills and professionally adapt to changes in the field.
V. Program Educational Objectives Assessment Tools and Timelines

The EET program developed a set of assessment tools to measure the PEOs defined in Table 4, which include job placement data, the alumni survey, the employer survey, and input from the IAB. Table 5 summarizes the various tools used to collect data in order to assess the program PEOs; it also identifies the person responsible for collecting and analyzing the data, as well as the frequency of the assessment. Setting a timeline [5] helps simplify the tasks associated with the assessment process, because not every PEO needs to be assessed every semester [5]. The EET program is moderately small in size, and access to its alumni and their employers can be very challenging. Conducting both the alumni and employer surveys every three years yields more robust data with an increased survey response rate. The program also uses both job placement data and feedback from the IAB as another set of assessment tools; this provides a flexible assessment process by not relying solely on the alumni and employer surveys when assessing the program PEOs, and it engages the IAB in periodically assessing the PEOs. A brief description of each assessment tool is provided below.

Table 5: Summary of PEO Assessment Tools

Assessment Tool                       Responsible for Data Collection/Analysis  Frequency
Job Placement                         University Career Center                  Each semester
Alumni Survey                         University Career Center/SoT Staff        Triennially
Employer Survey                       University Career Center/SoT Staff        Triennially
Input from Industrial Advisory Board  Faculty                                   Annually

Job Placement Data

Data from the University Career Center on graduates' job placement reflects how successful our graduates are in securing a job in a related field.

Alumni Survey

The alumni survey is a written questionnaire which alumni are asked to complete. Data will be collected every three years, then analyzed and used in continuous improvement.
Employer Survey

The employer survey is a written questionnaire which employers of the program's graduates are asked to complete. Data will be collected every three years. Results of the data analysis will be used to enhance and strengthen the program.

Input from the Industrial Advisory Board (IAB)

The EET-IAB assists the program in keeping current and relevant with industry. Input from the IAB is collected every year at the program's annual spring meeting. This input is used to make continuous improvements to the program.
VI. Program Outcomes Assessment Results - Continuous Improvements

The assessment process used different tools for different outcomes. Table 6 shows the assessment tools that were employed and the achievement standard for each of the Program Outcomes (POs), from which the overall achievement results for the POs are evaluated. The raw data collected has been processed, and the final results are presented here. The average scores from each assessment tool utilized were compared to the performance criteria passing threshold, and the results of the data evaluation are used to improve the EET program.
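This evaluation step amounts to comparing each tool's average against that tool's passing threshold for a given outcome. The sketch below is illustrative only; the tool names and numbers are made up, loosely echoing the PO1 criteria in Table 6:

```python
# Illustrative comparison of per-tool averages against their passing
# thresholds for one Program Outcome; all values here are invented.

def evaluate_outcome(results):
    """results maps tool name -> (average, threshold); returns tool -> met?"""
    return {tool: avg >= threshold for tool, (avg, threshold) in results.items()}

po_results = {
    "course_assessment": (68.0, 70.0),   # percent of students meeting the standard
    "senior_exit_survey": (2.8, 2.8),    # weighted average on a 4-point scale
    "student_rating": (3.6, 3.5),        # rating of instruction out of 5
}
print(evaluate_outcome(po_results))
# → {'course_assessment': False, 'senior_exit_survey': True, 'student_rating': True}
```

A per-tool pass/fail picture like this is what drives the targeted continuous improvement actions, rather than a single aggregate score per outcome.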
Table 6: Assessed Program Outcomes

PO1: 2.(a) An appropriate mastery of the knowledge, techniques, skills and modern tools of their disciplines.
Performance criteria by tool:
- Course Assessment: students demonstrate mastery of the course objectives relating to this program outcome. Achieved: Spring 2007: 59.14%; Fall 2007: 58.44%; Spring 2008: 68%.
- Senior Exit Survey: for each ability relevant to this outcome, the weighted average of responses will be at least 2.80/4.0 (17 responses, 81% response rate). Achieved: 2.8.
- Senior Exit Exam: students demonstrate mastery of specified outcomes as measured by the exit exam.
- Senior Project Evaluation: students demonstrate mastery of specified outcomes as measured by the senior project.
- Student Rating of Instruction: rating of instruction above 3.5/5.

The same tools and criteria apply to the remaining outcomes. Course assessment achievement (the percentage of students achieving the objective) and Senior Exit Survey weighted averages, in row order:

PO2: 2.(b) An ability to apply current knowledge and adapt to emerging applications of mathematics, science, engineering and technology. Course assessment: 82%; exit survey: 2.95.
PO3: 2.(c) An ability to conduct, analyze and interpret experiments and apply experimental results to improve processes. Course assessment: 87%; exit survey: 2.875.
PO4: 2.(d) An ability to apply creativity in the design of systems, components or processes appropriate to program objectives. Course assessment: 88%; exit survey: 2.8.
PO5: 2.(e) An ability to function effectively on teams. Course assessment: 88%; exit survey: 2.9.
PO6: 2.(f) An ability to identify, analyze and solve technical problems. Course assessment: 88%; exit survey: 3.15.
PO7: 2.(g) An ability to communicate effectively. Course assessment: 89%; exit survey: 2.875.
PO8: 2.(h) A recognition of the need for, and an ability to engage in, lifelong learning. Course assessment: 94%; exit survey: 3.15.
PO9: 2.(i) An ability to understand professional, ethical and social responsibilities. Course assessment: 100%; exit survey: 3.175.
PO10: 2.(j) A respect for diversity and a knowledge of contemporary professional, societal and global issues. Course assessment: 91%; exit survey: 3.05.
PO11: 2.(k) A commitment to quality, timeliness, and continuous improvement. Course assessment: 87%; exit survey: 3.187.
PO12: 8.(a) The application of circuit analysis and design, computer programming, associated software, analog and digital electronics and microprocessors to the building, testing, operation and maintenance of electrical/electronic systems. Course assessment: 92%.
PO13: 8.(b) The application of physics or chemistry to electrical/electronic circuits in a rigorous mathematical environment at or above algebra and trigonometry. Course assessment: 92%.
PO14: 8.(c) The ability to analyze, design and implement control systems, instrumentation systems, communication systems, computer systems or power systems. Course assessment: 93%.
PO15: 8.(d) The ability to apply project management techniques to electrical/electronic(s) systems. Course assessment: 100%.
PO16: 8.(e) The ability to utilize statistics/probability, transform methods, discrete mathematics or applied differential equations in support of electrical/electronic(s) systems.

Additional achieved results reported in Table 6: Senior Exit Exam results for Spring 2008 of 72.8%, 69.49%, 70.13%, 68.3%, and 68.04%, with 91.6% of students demonstrating mastery of the specified outcomes; Senior Project Evaluation results of 66.6% on written reports and 91.6% on oral presentations, with 88.8% and 88% of senior project teams earning marks of Excellent or Competent.
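The Senior Exit Survey criterion in Table 6 (a weighted average of at least 2.80 on a 4-point scale) reduces to simple arithmetic over the response counts. A small sketch with made-up counts for one surveyed ability:

```python
# Weighted average of Likert responses on a 4-point scale; the response
# counts below are invented for illustration, not actual survey data.

def weighted_average(counts):
    """counts maps a Likert value (1..4) to the number of responses."""
    total = sum(counts.values())
    return sum(value * n for value, n in counts.items()) / total

# e.g. 17 responses: 1 rated 1, 3 rated 2, 8 rated 3, 5 rated 4
counts = {1: 1, 2: 3, 3: 8, 4: 5}
avg = weighted_average(counts)
print(round(avg, 2), avg >= 2.80)  # → 3.0 True
```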
VII. Program Educational Objectives Assessment Results - Continuous Improvements

The PEO assessment process used different tools for different objectives. Table 7 shows the assessment tools that were employed and the achievement standard for each of the PEOs. The overall achievement results for the PEOs were evaluated; the raw data collected has been processed, and the final results are presented here. The average scores from each tool used were averaged to get the final result for that particular PEO. The results were then compared to the performance criteria passing threshold. Overall, the results revealed no major shortcomings in assessing the EET PEOs, i.e., the overall averaged results were above the desired target. Analysis of the assessment results shows that the EET PEOs have been evaluated and goals were met. Assessment results from job placement data and feedback from the IAB reflect more accurate results than the employer and alumni surveys. This could be a result of the difficulty of reaching program alumni and their employers: the 2004-2005 cycle survey showed only a 15.2% response rate, which can be linked to being a small program with about 15-20 graduates per year. Conducting the survey every three years will help improve the survey response rate and yield more robust data. The employer survey will be redesigned to be able to measure PEO3 [6], and the new cycle of surveys should reflect a better assessment of PEO3.
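The averaging step described above — per-tool average scores combined into one PEO result and compared to the passing threshold — can be sketched as follows. The scores and threshold are illustrative only, normalized here to a 0-1 scale:

```python
# Hypothetical PEO roll-up: average the per-tool scores, then compare the
# overall result to the passing threshold. All numbers are illustrative.

def peo_met(tool_scores, threshold):
    """Return (overall average, whether the threshold is met)."""
    overall = sum(tool_scores) / len(tool_scores)
    return overall, overall >= threshold

# e.g. job placement 0.90, alumni survey 0.79, IAB feedback 0.85
overall, met = peo_met([0.90, 0.79, 0.85], threshold=0.70)
print(f"{overall:.3f} {met}")  # → 0.847 True
```

Averaging across tools in this way is what makes the process robust to one weak instrument (such as a low-response survey), since the other tools still contribute to the overall PEO result.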
Table 7: Assessed Program Educational Objectives

PEO1: Graduates of the program will be well prepared for their first position in the field.
- Job Placement: EET graduates are currently employed in a degree-related job as reported by university job placement data. Achieved: Fall 2007 (86%), Spring 2008 (90%).
- Alumni Survey: EET alumni responding to the Alumni Survey will indicate they rate the overall quality of their EET educational experience as good or excellent. Achieved: 100% of EET alumni rated the quality of their EET education as good or excellent.

PEO2: Graduates of the program will be successfully employed in a degree-related job or pursuing an additional degree.
- Alumni Survey: EET graduates responding to the Alumni Survey Questionnaire will indicate they are 1) currently employed in a position directly related to their education, 2) have accepted a job offer for a position directly related to their education, 3) have at one time worked in a position directly related to their education since graduation, or 4) are currently pursuing an additional college degree. Achieved: 67% of the respondents in 2004/2005 indicated one of the four desired responses (the goal is not met here).

PEO3: Employers will be satisfied with the performance of the program graduates, including skills such as effective teamwork and communication in a professional environment, and professional ethics.
- Alumni Survey: one or more abilities are listed which reflect this objective. Alumni are asked to rate the quality of preparation to demonstrate each ability they feel they received from their MTU education. For each ability relevant to this objective, the weighted average of responses will be at least 2.80. Achieved: 2004/2005 Alumni Survey Questionnaire weighted average: 3.18 (10 responses, 15.2% response rate).
- Employer Survey: EET employers responding to the Employer Survey will indicate they are either very satisfied or satisfied with EET graduates' performance. Achieved: the satisfaction rating of MTU BS-EET employers cannot be clearly determined from the 2004/2005 employer survey results due to a survey design flaw, which was corrected in the new survey.
- EET Advisory Board: will meet annually, provide feedback to improve the quality of the program, and evaluate the senior project design teams. Achieved: the EET IAB met twice during the academic year, and recommended actions are discussed in the respective IAB meetings; IAB members are also part of the assessment team for the senior project.

PEO4: Graduates of the program will be satisfied with their education and show the ability to continuously improve their skills and professionally adapt to changes in the field.
- Alumni Survey: one or more abilities are listed which reflect this objective. Alumni are asked to rate the quality of preparation to demonstrate each ability they feel they received from their MTU education. For each ability relevant to this objective, the weighted average of responses will be at least 2.80. Achieved: 2004/2005 Alumni Survey Questionnaire weighted average: 3.18 (10 responses, 15.2% response rate).
VIII. Conclusion

This paper provides guidance on the Program Outcomes and Program Educational Objectives assessment process developed and implemented by the Electrical Engineering Technology program to ensure compliance with ABET Criterion 3 (Assessment and Evaluation). The Program Outcomes and Program Educational Objectives are identified in line with ABET's Technology Accreditation Commission (TAC). While the POs are narrower statements that describe what students are expected to know and be able to do by the time of graduation, the PEOs are broad statements that describe the career and professional accomplishments that the program is preparing graduates to achieve. The overall average results for each PO and PEO revealed no major shortcomings in the EET achievement; however, the results of the assessment tools identified some areas which might benefit from improvement. Based on this assessment, recommendations are made for the purpose of continuous improvement of the EET program.

References

[1] Criteria for Accrediting Engineering Technology Programs, Technology Accreditation Commission, ABET, Inc., Baltimore, Maryland, 2007.
[2] R. Terry et al., "The Use of Direct and Indirect Evidence to Assess University, Program, and Course Level Objectives and Student Competencies in Chemical Engineering," ASEE Annual Conference & Exposition (ASEE 2007), June 2007.
[3] J. Shaeiwitz and D. Briedis, "Direct Assessment Measures," ASEE Annual Conference & Exposition (ASEE 2007), June 2007.
[4] G. Rogers, "Direct and Indirect Assessments: What Are They Good For?," Community Matters, August 2006.
[5] G. Rogers, "Establishing Timelines and Responsibilities - An Example," from Assessment Planning Flow Chart, 2004.
[6] G. Rogers, "Surveys and Questionnaires: Do They Measure Up?," Assessment Tips with Gloria Rogers, Communications Link, a publication of ABET, Inc.; retrieved from www.abet.org, January 2008.

Biography: Dr.
Alaraje's research interests focus on processor architecture, System-on-Chip design methodology, Field-Programmable Gate Array (FPGA) architecture and design methodology, Engineering Technology education, and hardware description language modeling. He is currently the Electrical Engineering Technology program chair as well as a faculty member at Michigan Technological University; he has taught and developed courses in the Computer Engineering Technology area at the University of Cincinnati and at Michigan Technological University. Dr. Alaraje is a Fulbright scholar; he is a member of the American Society for Engineering Education (ASEE), the ASEE Electrical and Computer Engineering Division, the ASEE Engineering Technology Division, the Institute of Electrical and Electronics Engineers (IEEE), and the Electrical and Computer Engineering Technology Department Heads Association (ECETDHA).