
Computer Systems Technology AAS
Educational Effectiveness Assessment Plan, Version 5
Adopted by: Matanuska-Susitna College CST Faculty, Brenda Forsythe
Submitted to MSC Office of Academic Affairs: March 2008
Dennis Clark, Director, Matanuska-Susitna College: April 2008

TABLE OF CONTENTS
Mission Statement
Program Introduction
Assessment Process Introduction
Program Outcomes
Table 1: Association of Assessment Measures to Program Outcomes
Table 2: Program Outcomes Assessment Measures and Administration
Assessment Implementation & Analysis for Program Improvement
Appendix A: Exam Score (with Planning/Results Example Sheet)
Appendix B: Assignment Performance (with Planning/Results Example Sheet)
Appendix C: National Certification Exams
Appendix D: Exit Survey (with Exit Survey Example)
Appendix E: Work Study

CST Forsythe, MSC Committee Approved March 2008

MISSION STATEMENT

The program objective is the development of a well-trained workforce for the State of Alaska. The Associate of Applied Science in Computer Systems Technology provides skills and education in the field of network and systems administration. The degree is designed to teach students both the business and IT-related concepts needed to enter the workforce as a systems administrator and technician.

PROGRAM INTRODUCTION

Matanuska-Susitna College offers an Associate of Applied Science in Computer Systems Technology.[1] This document defines the educational outcomes for the Computer Systems Technology program and outlines a plan for assessing the achievement of the stated outcomes. The plan was developed to meet the guidelines of relevant industry standards, and in consultation with the Vice Provost of the Office of Academic Affairs and his staff, as a means of demonstrating the effectiveness of the program. In addition, the current Educational Assessment Plan is a combined effort of the CST Program Coordinator and the Mat-Su Academic Affairs Department. The program's overarching outcome is the development of a well-trained workforce for the State of Alaska. Because many jobs in the computer technology sector are predicted to grow at high rates in the coming decade, this degree program was designed to train essential employees for that sector.

ASSESSMENT PROCESS INTRODUCTION

An updated set of outcomes is presented in this Assessment Plan for the 2007-08 Assessment Cycle. This Plan is a revision of the joint 2003 Educational Assessment Plan, a shared venture between Mat-Su College and Kodiak College. The original outcomes were determined in 2003 in discussions held by the faculty over the course of several months. The outcomes have since been revised in this document for the 2007-08 AY Assessment Cycle and beyond.
The faculty met and accepted the original objectives, outcomes, and assessment processes on March 28, 2003. They were amended in this document on January 12, 2006, and a few additional edits have been made for AY 2007-2008. This document defines the expected student learning outcomes for the CST program and outlines a plan for assessing the achievement of the stated outcomes.

[1] This program was originally co-developed by Matanuska-Susitna College and Kodiak College in 2003. It is offered at both campuses. This document assesses only the studies at Matanuska-Susitna College.

PROGRAM OUTCOMES

The Computer Systems Technology program has educational outcomes to produce graduates who will be able to:
- Demonstrate proficiency in operating system and utility software installation and configuration, in computer hardware and software management, and in network management.
- Exhibit mastery of the tools necessary to create and manage a Windows-based network.
- Exhibit oral and written communication skills consistent with a career in Computer Science.
- Demonstrate abilities in critical thinking, problem solving, analysis, and software design.
- Demonstrate basic knowledge of core concepts in Computer Science, including algorithms, data structures, concepts of programming languages, operating systems, and computer organization and architecture.
- Demonstrate basic understanding of the theoretical foundations of Computer Science, including discrete mathematics, algorithm analysis, and computability.

Program outcomes have been published in the current MSC Bulletin and are posted on the MSC web site.

The Computer Systems Technology program has educational outcomes to produce graduates who:

OUTCOME 1: Have sufficient technical competence to obtain employment in the area of Information Technology and to be able to progress professionally within the discipline.
Student Learning Outcome #1: Students will be able to demonstrate proficiency in operating system and utility software installation and configuration.
Student Learning Outcome #2: Students will be able to demonstrate proficiency in computer hardware and software management.
Student Learning Outcome #3: Students will be able to demonstrate proficiency in network management.
Student Learning Outcome #4: Students will be able to demonstrate mastery of the tools necessary to create and manage a Windows-based network.

OUTCOME 2: Effectively communicate with both technical and non-technical colleagues.
Student Learning Outcome #5: Students will have an understanding of basic customer service principles, including relationships, perceptions, telephone techniques, quality, ethics, record keeping, interpersonal relationships, and teamwork.

OUTCOME 3: Are able to work effectively as a member of a team.

Student Learning Outcome #5: Students will have an understanding of basic customer service principles, including relationships, perceptions, telephone techniques, quality, ethics, record keeping, interpersonal relationships, and teamwork.
Student Learning Outcome #6: Students will have an understanding of the fundamental aspects and uses of project management in the workplace.

OUTCOME 4: Are able to apply their knowledge and skills to create and operate information systems that add to the capabilities of business organizations.
Student Learning Outcome #1: Students will be able to demonstrate proficiency in operating system and utility software installation and configuration.
Student Learning Outcome #2: Students will be able to demonstrate proficiency in computer hardware and software management.
Student Learning Outcome #3: Students will be able to demonstrate proficiency in network management.
Student Learning Outcome #4: Students will be able to demonstrate mastery of the tools necessary to create and manage a Windows-based network.
Student Learning Outcome #6: Students will have an understanding of the fundamental aspects and uses of project management in the workplace.

OUTCOME 5: Demonstrate professional and ethical behavior in the workplace.
Student Learning Outcome #5: Students will have an understanding of basic customer service principles, including relationships, perceptions, telephone techniques, quality, ethics, record keeping, interpersonal relationships, and teamwork.

TABLE 1: ASSOCIATION OF ASSESSMENT MEASURES TO PROGRAM OUTCOMES

Outcomes:
1. Demonstrate proficiency in operating system and utility software installation and configuration, in computer hardware and software management, and in network management.
2. Exhibit mastery of the tools necessary to create and manage a Windows-based network.
3. Exhibit oral and written communication skills consistent with a career in Computer Science.
4. Demonstrate abilities in critical thinking, problem solving, analysis, and software design.
5. Demonstrate basic knowledge of core concepts in Computer Science, including algorithms, data structures, concepts of programming languages, operating systems, and computer organization and architecture.
6. Demonstrate basic understanding of the theoretical foundations of Computer Science, including discrete mathematics, algorithm analysis, and computability.

Outcome | Exam Score | Assignment Performance | National Certification Exams | Exit Survey | Work Study
   1    |     1      |           1            |              1               |      1      |     1
   2    |     0      |           1            |              0               |      0      |     1
   3    |     0      |           1            |              0               |      0      |     1
   4    |     1      |           1            |              1               |      0      |     1
   5    |     1      |           1            |              0               |      0      |     0
   6    |     0      |           1            |              0               |      0      |     0

0 = Measure is not used to assess the associated outcome. 1 = Measure is used to assess the associated outcome.

TABLE 2: PROGRAM OUTCOMES ASSESSMENT MEASURES AND ADMINISTRATION

Exam Score
  Description: Proficiency in program outcome demonstrated by final or certification preparation exam score.
  Frequency/Start Date: Every semester, beginning fall 2007.
  Collection Method: Results from exams.
  Administered by: Course Instructors.

Assignment Performance
  Description: Proficiency in program outcome demonstrated by assignment or project.
  Frequency/Start Date: Every semester, beginning fall 2007.
  Collection Method: Assignments, grading criteria, and results collected from instructors.
  Administered by: Course Instructors.

National Certification Exams
  Description: Results of MCSE, CCNA, A+, and Net+ Industry Certification Exams.
  Frequency/Start Date: Yearly, beginning fall 2007.
  Collection Method: Results from exams.
  Administered by: Vendor Authorized Testing Center.

Exit Survey
  Description: Graduating students will be given an Exit Survey in which they can measure their accomplishments in the program against the program objectives.
  Frequency/Start Date: Yearly, beginning spring 2008.
  Collection Method: Hand delivered, mail, or electronic delivery.
  Administered by: Program Coordinator/Assessment Staff.

Work Study
  Description: Survey of employers/supervisors of Work Study students to ascertain the quality of education and career preparation.
  Frequency/Start Date: Every semester, beginning fall 2007.
  Collection Method: Report from employer/supervisor.
  Administered by: Program Coordinator/Assessment Staff.

ASSESSMENT IMPLEMENTATION & ANALYSIS FOR PROGRAM IMPROVEMENT

General Implementation Strategy

The Computer Systems Technology Program Director will work with the CST Department faculty and the Academic Affairs Department Assessment staff to collect the required data. The coordinator also provides support for course-level assessment and other assessment activities as needed, and will assemble all the data and prepare the final report for review by the program faculty.

Method of Data Analysis and Formulation of Recommendations for Program Improvement

The MSC Assessment Committee will review the collected data at least once a year. This review will result in recommendations for program changes designed to enhance performance relative to the program's outcomes. The results of the data collection, the Educational Assessment Plan, the interpretation of the results (Annual Report), and the recommended programmatic changes are to be forwarded to the Office of Academic Affairs (in the required format) by the middle of May each year. The proposed programmatic changes may be any action or change in policy that the MSC Assessment Committee deems necessary to improve performance relative to the program's outcomes. Recommended changes should consider the workload of faculty, staff, and students; budgetary constraints; facility constraints; and any other restriction that may apply. A few examples of changes made by programs at Matanuska-Susitna College include: changes in courses (including, but not limited to, course content, scheduling, sequencing, prerequisites, and delivery methods); changes in faculty or staff assignments; addition and/or replacement of equipment; and changes to facilities.

Modification of the Assessment Plan

The Program Director and MSC Assessment Committee, after reviewing the collected data and the processes used to collect it, may decide to alter the Educational Assessment Plan (EAP).
Changes may be made to any component of the Plan, including the outcomes, the assessment tools, or any other aspect. Changes are to be initiated by the Program Director and approved by the MSC Assessment Committee. The modified EAP is to be forwarded to the MSC Academic Affairs Department.

2007-2008 Assessment Cycle

The Mat-Su College Computer Systems Technology Program Coordinator and full-time faculty position is vacant and a hiring search is under way. It is assumed that the new Coordinator may want to alter this EAP significantly. This Plan was rewritten during the spring 2006 cycle, and no data has been collected for the above-mentioned assessment tools except the Outcome Surveys for previous years. The IDEA surveys implemented during fall 2007 will not be mapped to program outcomes, so they have not been included in this plan.

APPENDIX A: EXAM SCORE

Measure Description: Proficiency in program outcome demonstrated by final or certification preparation exam score. This tool has been identified as a measure because it is a measurable demonstration of the course outcomes as outlined in the Course Content Guide. In turn, the course outcomes reflect the identified Program Outcomes.

Factors that affect the collected data: Data could be negatively affected by inconsistencies in course instruction by different instructors. Sampling should be good because the exams are part of existing courses.

How to interpret the data: Exam results are mapped to course outcomes. The results will be tallied without identifying student data. Results will be used to help instructors better meet the objectives of the Course Content Guide and to determine whether Course Content Guide objectives need to be modified to reflect industry changes.

Tabulating and reporting results: The tabulation database will be prepared by the Program Director and Assessment staff. The measure is administered by the Program Director. Assessment staff in Academic Affairs will receive the results, tabulate and format the analysis of the data, and return it to the Program Director for analysis in the Annual Report. Specific examples will be added to this document to illustrate the tool.
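As an illustration of the tallying step described above, the following minimal sketch shows how anonymized exam results, already mapped to outcome numbers, might be aggregated into per-outcome success rates. It is not part of the plan itself; the record layout and the 70% passing threshold are assumptions for the example.

```python
# Illustrative sketch only: tally exam results per course outcome without
# student identifiers. Record layout and threshold are assumptions.
from collections import defaultdict

def tally_by_outcome(records, passing_percent=70.0):
    """records: iterable of (outcome_number, percent_score) pairs,
    already stripped of any student identifiers."""
    totals = defaultdict(lambda: {"assessed": 0, "successful": 0})
    for outcome, score in records:
        totals[outcome]["assessed"] += 1
        if score >= passing_percent:
            totals[outcome]["successful"] += 1
    # Success rate per outcome, suitable for the Annual Report tables.
    return {
        outcome: counts["successful"] / counts["assessed"]
        for outcome, counts in totals.items()
    }

print(tally_by_outcome([(1, 85.0), (1, 62.5), (3, 90.0)]))  # {1: 0.5, 3: 1.0}
```

Because only outcome numbers and scores are recorded, the aggregate can be reported without any student-identifying data, matching the "tallied without identifying student data" requirement above.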

MSC Student Learning Objective Planning/Results Sheet

Semester/Year:
Subject:
Outcome #:
Instructor:
Course:
CRN:
Total Enrollment:
Total Number of Students Assessed:
Total Number Successful:
Total Number Unsuccessful:
Total Points or Percentage Possible:
Total Points or Percent Required for Success:
Description of assignment:

Please either list all individual student points or scores for this assessment OR attach a copy of your grade sheet without student names. Please use the back side to write any analysis or comments regarding this assessment tool, this course, or this program.

Please attach a copy of:
1. The assignment
2. A successful assessment score sheet with the corresponding student product
3. An unsuccessful assessment score sheet with the corresponding student product

Please remove any student identifiers.

APPENDIX B: ASSIGNMENT PERFORMANCE

Measure Description: Proficiency in program outcome will be demonstrated by assignment or project score. This tool has been identified as a measure because it is a measurable demonstration of the course outcomes as outlined in the Course Content Guide. In turn, the course outcomes reflect the identified Program Outcomes.

Factors that affect the collected data: Data could be negatively affected by inconsistencies in course instruction by different instructors. Sampling should be good because the assignments are part of existing courses.

How to interpret the data: Assignment results are mapped to course outcomes. The results will be tallied without identifying student data. Results will be used to help instructors better meet the objectives of the Course Content Guide and to determine whether Course Content Guide objectives need to be modified to reflect industry changes.

Tabulating and reporting results: The tabulation database will be prepared by the Program Director and Assessment staff. The measure is administered by the Program Director. Assessment staff in Academic Affairs will receive the results, tabulate and format the analysis of the data, and return it to the Program Director for analysis in the Annual Report.

MSC Student Learning Objective Planning/Results Sheet

Semester/Year:
Subject:
Outcome #:
Instructor:
Course:
CRN:
Total Enrollment:
Total Number of Students Assessed:
Total Number Successful:
Total Number Unsuccessful:
Total Points or Percentage Possible:
Total Points or Percent Required for Success:
Description of assignment:

Please either list all individual student points or scores for this assessment OR attach a copy of your grade sheet without student names. Please use the back side to write any analysis or comments regarding this assessment tool, this course, or this program.

Please attach a copy of:
1. The assignment
2. A successful assessment score sheet with the corresponding student product
3. An unsuccessful assessment score sheet with the corresponding student product

Please remove any student identifiers.

APPENDIX C: NATIONAL CERTIFICATION EXAMS

Measure Description: National and vendor-specific exams are designed to ensure that all candidates possess a minimum knowledge level in the area(s) of examination, and therefore a consistent skill set. The CST Program currently maps to the following vendor certification exams:
- Seven Microsoft certification exams, all of which are required to be recognized as a Microsoft Certified Systems Engineer (MCSE).
- The Cisco CCNA exam, designed to ensure that all candidates have a given level of knowledge in networking and Cisco technology. Students who pass this exam are awarded the Cisco Certified Network Associate (CCNA) certification.
- The CompTIA A+ and Network+ exams, which test a candidate's knowledge of basic computer hardware and networking principles and concepts.

Factors that affect the collected data: These exams are no longer administered by the in-house testing center at Mat-Su College. The cost of the exams results in a low participation rate by students, and because industry exams are not mandatory, the data gathered is minimal. It may be a challenge to gather industry exam data.

How to interpret the data: The pass/fail ratio of students taking the industry exams provides an overall look at teaching effectiveness within the Program. Test scores help the Program Director know whether students have successfully grasped the course outcomes. A review of the subcategory scores provides data on specific areas within a topic, which helps pinpoint the particular areas that need improvement.

Tabulating and reporting results: The tabulation database will be prepared by the Program Director and Assessment staff. The measure is administered by the Program Director. Assessment staff in Academic Affairs will receive the results, tabulate and format the analysis of the data, and return it to the Program Director for analysis in the Annual Report.

APPENDIX D: EXIT SURVEY

Measure Description: The Exit Survey asks graduates of the CST program to rate their performance relative to the program's outcomes. Additionally, graduates are asked to rate the program's delivery of the material related to the outcomes from their viewpoint. A sample of the survey instrument is included below.

Factors that affect the collected data: A number of factors need to be taken into consideration when analyzing the data, including the following:
- Low return rates reduce the accuracy of the results.
- Timing of distribution can pose a problem; it will have to be determined whether the end of the semester before graduation is the best time for obtaining a response.
- Since there may be only one or two graduates in a particular course, exit surveys cannot be distributed during class time. It may be necessary for the Program Coordinator to contact each individual graduate in order to distribute the survey.

How to interpret the data: Care should be taken to investigate and discuss the factors influencing the results before interpreting them. Results of the surveys should be compared against other indicators to compose an accurate reflection of program performance relative to the expected outcomes.

Tabulating and reporting results: The survey is prepared by the Program Director and Assessment staff, and administered by the Program Director. Assessment staff in Academic Affairs will receive the results, tabulate and format the analysis of the data, and return it to the Program Director for analysis in the Annual Report.

Sample Survey: A sample of the exit survey is provided below.

Matanuska-Susitna College CST Graduate Exit Survey

Matanuska-Susitna College is continually seeking to improve our programs to provide our graduates with the best possible knowledge and skills to be successful. Your candid and thoughtful feedback to the following questions is essential to this objective. We are asking for your name and current address so we can thank you for responding and maintain our ability to remain in contact with you. However, your name and address will be separated from the survey BEFORE any tabulation is made. Be assured that your responses and comments will be presented only as aggregate summaries and no individual will be identified.

Name:
Address:
Email:

Mat-Su College CST Graduate Exit Survey

1. Gender: Male / Female    Age: 20-24 / 25-29 / 30-39 / 40-49 / 50-59 / 60+

EMPLOYMENT STATUS
2. Are you currently employed? Yes / No    If no, are you seeking employment? Yes / No
3. If yes, are you working in the field in which you received your training? Yes / No
4. In what fields are you employed or seeking employment?
5. What type of organization do you work for? Federal / State / Borough / School District / Medical / Construction / Utilities / Small Private / Legal / Other
6. Have you had additional training since receiving your degree/certificate from Mat-Su? Yes / No
7. If yes, why did you seek additional training? Requirement of your current position / Seeking promotion or advancement in your current job / Seeking another job / Personal enrichment or professional growth

RATING THE PROGRAM
This program had a series of specific outcomes we expected to attain with you. Please rate the extent to which you believe each of the following was accomplished: Excellent (4), Good (3), Fair (2), Poor (1), or Not Applicable.

1. Demonstrate proficiency in operating system and utility software installation and configuration.
2. Demonstrate proficiency in computer hardware and software troubleshooting.
3. Demonstrate proficiency in network setup, configuration, certification, and troubleshooting.
4. Are able to install and configure various operating systems, including Windows 2000 and any other current operating system.
5. Develop hardware/software upgrade techniques sufficient to keep pace with advances in technology.
6. Know the proper use of Windows 2000 security tools.
7. Will be able to analyze the various types of network technologies and topologies to determine the proper type to use in a given situation.
8. Demonstrate the mastery of the tools necessary to create and manage user accounts in a Windows 2000 workgroup and/or domain.

RATING THE PROGRAM (continued)
Please rate each item as Excellent (4), Good (3), Fair (2), Poor (1), or Not Applicable.

9. Demonstrate the mastery of the tools necessary to create and manage group accounts in a Windows 2000 workgroup and/or domain.
10. Are able to identify, design, and implement a network services management strategy.
11. Demonstrate router interfacing, command line editing, startup, setup, and configuration.
13. Demonstrate proficiency in the management of Local Area Networks (LANs).
14. Demonstrate proficiency in networking technology and practices.
16. Have an understanding of basic customer service principles, including relationships, perceptions, telephone techniques, quality, ethics, record keeping, interpersonal relationships, and teamwork.
19. Have an understanding of the relationship between the effective use of project monitoring and evaluation and the success of a project.
20. Have an understanding of the foundations of business, including profit, issues of social responsibility, and forms of ownership.
21. Have an understanding of the fundamental aspects and uses of project management in the workplace.
22. Have the ability to research, discover, and/or use the various types of project management software, and are familiar with the interface and features available.
23. Ability to seek information and analyze data to problem-solve.
24. Ability to verbally communicate effectively.
25. Ability to work effectively as a member of a team.
26. Ability to express ideas in writing.
27. Ability to solve computational/mathematical problems.

RATING THE UNIVERSITY SUPPORT SERVICES
Please rate each item as Excellent, Good, Fair, or Poor.
- Quality of the Computer Systems Technology Department instruction
- Quality of Mat-Su College academic advising
- Computer Systems Technology Department support staff helpfulness

What improvements would you recommend to make this Mat-Su College program better?

Thank you for your candid and thoughtful responses and for your participation in this survey. We truly strive to increase the quality and effectiveness of the Computer Systems Technology Program at MSC. If you have any questions or comments about this survey, please contact the Program Coordinator at cst@matsu.alaska.edu.
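The tabulation step described in Appendix D could be sketched as follows. This is an illustrative assumption about how the 4-point ratings might be averaged per survey item (with "Not Applicable" responses excluded), not a description of the actual MSC tabulation database.

```python
# Illustrative sketch only: average exit-survey ratings per item on the
# 4-point scale (Excellent=4 ... Poor=1). Input format is an assumption.

def mean_ratings(responses):
    """responses: list of dicts mapping item number -> rating (1-4),
    or None where the respondent marked "Not Applicable"."""
    sums, counts = {}, {}
    for response in responses:
        for item, rating in response.items():
            if rating is None:          # "Not Applicable" is excluded
                continue
            sums[item] = sums.get(item, 0) + rating
            counts[item] = counts.get(item, 0) + 1
    # Mean rating per item across all respondents who rated it.
    return {item: sums[item] / counts[item] for item in sums}

print(mean_ratings([{1: 4, 2: None}, {1: 3, 2: 2}]))  # {1: 3.5, 2: 2.0}
```

Because names and addresses are separated from responses before tabulation, only anonymous per-item aggregates like these would appear in the Annual Report.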

APPENDIX E: WORK STUDY

Measure Description: The survey asks employers to rate the performance of students who have completed a Work Study course and placement as part of the CST Program. Employers are asked value questions targeting the objectives of the program and then are asked how well the MSC student performs in relation to each objective. Additionally, employers are asked to rate the importance of the program objectives from their viewpoint.

Factors that affect the collected data: A number of factors need to be taken into consideration when analyzing the data, including the following:
- Not all students will be placed in a Work Study situation; Work Study is one of two courses that fulfill the requirement for the CST AAS.
- The data for participating employers and students should show some consistency and value, as they are structured within the course.
- Work Study situations and outcomes will vary.

How to interpret the data: Care should be taken to investigate and discuss the factors influencing the results before interpreting the outcome. The results of the surveys should also be compared against alumni surveys to get a good picture of program performance. Be aware that there is no direct connection between the two surveys linking employers to the alumni who work for them.

Tabulating and reporting results: The tabulation database will be prepared by the Program Director and Assessment staff. The survey is administered by the Program Director. Assessment staff in Academic Affairs will receive the results, tabulate and format the analysis of the data, and return it to the Program Director for analysis in the Annual Report.