Outcome 1
Network Administration students will be employed as network support or technical support professionals.
Measurable Criteria: 80% of graduates will be employed in a technical support or network administration position within a year after graduation.
Measurement Tool: Graduate surveys
Time Frame: Summer 2010; Summer 2011
Results:
Analysis and Action:

Outcome 2
Employers will indicate satisfaction with graduates' skills obtained at SPSCC.
Measurable Criteria: 80% of employers of SPSCC graduates will indicate our graduates have good to excellent entry-level skills.
Measurement Tool: Employer surveys
Time Frame: Spring 2010; Spring 2011
Results: There were not enough employer satisfaction surveys completed to conduct a valid analysis.
Analysis and Action: The faculty will continue to provide the Institutional Research office with employer contact information, which Institutional Research will use to send out employer satisfaction surveys.

Outcome 3
Network Administration students will demonstrate their skills by designing and implementing a client-server network.
Measurable Criteria: 70% of students will pass the capstone course CNA 296 with a C+ or better, in which they design and implement a client-server network as members of a design and implementation team.
Measurement Tool: Project evaluation; peer evaluation
Time Frame: Spring 2011

Results: Students in the CNA 296 capstone course were responsible for assessing 80% of the final applied grade. Grading was completely anonymous, with evaluations submitted in sealed blank envelopes. Project notebooks compiled over a ten-week period carried the remaining 20% weighting and were evaluated by the instructor. The peer grading categories were as follows:

Peer Teamwork and Communication: 25%. Communication is a vital factor in all interpersonal interaction, especially on a team. Team members must be able to articulate their feelings, express plans and goals, share ideas, and see each other's viewpoints. This category assesses an individual's ability to communicate successfully; students assess the communication and listening skills of the other team members. Teamwork is equally important: it describes the actions of individuals brought together for a common purpose or goal, in which group needs override the needs of any one individual. In essence, each person on the team puts aside his or her individual needs to work toward the larger group objective.

Peer Attendance: 5%. Participation in team projects requires individuals to attend classes regularly.

Peer Effort: 20%. Individual peers are assessed on their willingness to participate in goals set by the project coordinators. It is sometimes easier to rate this category with percentages: did the individual give a hundred percent? If not, what percentage did they give? Grade accordingly.

Peer Technical Skills: 30%. Individual technicians are assessed on problem solving and the application of technical skills toward group goals: did the individual use the technical resources at their disposal to solve issues and further team goals? Loss of time due to technical errors and poor effort may be considered in this category.

100% of students met the outcome requirement of a C+ or higher grade. See the chart that follows.

Analysis and Action: Results were desirable and the assessment outcome was achieved. However, self-grading methods may not be stringent enough to reflect desirable grading outcomes. More stringent weighting will be applied to project notebooks to help offset student-applied grades and produce a more realistic grading bell curve.

Outcome 4
Students are prepared for advanced coursework.
Measurable Criteria: 85% of students will pass an examination on prerequisite materials.
Measurement Tool: Tests that assess CNA 121 material in CNA 122, CNA 122 material in CNA 221, CNA 221 in CNA 231, CNA 233 in CNA 234, CNA 101 in CNA 250, and CNA 250 in CNA 251.
Time Frame: Winter 2010 through Spring 2011

Results: CNA 122 and CNA 221 were assessed in 2009 and 2010; the other classes were not assessed. The results of the assessment tests for CNA 122 and CNA 221 were as follows:

Class Assessed: CNA 122, Spring 2009
  Assessment test results: One student out of 14 tested failed the assessment test, for a 92.9% success rate.
  Average assessment test score: 77.6%
  Average course percent of total points: 81.5%

Class Assessed: CNA 221, Fall 2009
  Assessment test results: One student out of 23 failed the assessment test, for a 95.6% success rate.
  Average assessment test score: 83.2%
  Average course percent of total points: 83.8%

Class Assessed: CNA 122, Spring 2010
  Assessment test results: All students passed. The minimum score was 63.5% and the maximum score was 97.5%.
  Average assessment test score: 79.9%
  Average course percent of total points: 79.9%

Class Assessed: CNA 221, Fall 2010
  Assessment test results: All students passed. The minimum score was 66.0% and the maximum score was 100.0%.
  Average assessment test score: 82.1%
  Average course percent of total points: 82.1%

Other Results: I decided to determine whether there was any correlation between class performance and assessment test results. Class performance, as a raw percentage of the total points for the class, was plotted against the assessment test results. The only class to show any degree of correlation was the Spring 2010 CNA 122 class (R² = 0.84). Otherwise, there was no way to predict the assessment test outcome from a student's performance in the course.
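The correlation check described above can be sketched in a few lines. The paired scores below are hypothetical placeholders, not the actual class data, and the `r_squared` helper is our own illustration, not part of the original analysis.

```python
# Sketch of the correlation check: compare each student's overall course
# percentage with their assessment-test score and report R^2 for a simple
# linear fit. All scores below are hypothetical, not the actual class data.

def r_squared(xs, ys):
    """Coefficient of determination (R^2) for a simple linear fit of ys on xs."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    r = cov / (var_x * var_y) ** 0.5  # Pearson correlation coefficient
    return r * r

# Hypothetical paired scores: course percent vs. assessment-test percent
course_pct = [92.0, 85.5, 78.0, 88.0, 70.5, 95.0]
test_pct = [90.0, 80.0, 75.0, 85.0, 68.0, 96.0]

print(round(r_squared(course_pct, test_pct), 2))
```

For a simple linear fit of one variable on another, R² is just the square of the Pearson correlation coefficient, which is how a value such as the R² = 0.84 reported above would be obtained from the plotted pairs.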
Analysis and Action: We exceeded the criteria of this assessment item for every measurement. The criteria stated that "85% of students will pass an examination on prerequisite materials." 92.9% of the students who took CNA 122 in 2009 passed examinations on their prerequisite material, and 95.6% of the students who took CNA 221 in 2009 did so. All of the students in both courses (CNA 122 and CNA 221) passed their examinations in 2010. This is not that surprising, as students reinforce their skills from the prerequisite courses during the quarter in which they take their assessment tests. The poor correlation between course scores and assessment test results was somewhat surprising; during previous assessments, we found a strong correlation. We are not sure why this happened, but the sample size and other unknown variables may be responsible for the lack of correlation. The testing method we used for the assessment test was exactly the same as the testing method we used in the courses. However, during the courses we also evaluate students' written assignments and hands-on projects, which usually improve their overall course scores (their test scores alone are normally lower). We can therefore infer that even though students' assessment test scores are lower than their overall course scores, those test scores reflect a higher level of competency when measured by testing alone. We hope to develop more specific methods in the future to assess our students on the skills we impart to them.
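The inference above can be illustrated with a toy calculation. The component weights and scores below are assumptions chosen for illustration only, not SPSCC's actual grading scheme: higher-scoring written assignments and hands-on projects pull a student's overall course percent above their test-only average.

```python
# Illustration only: hypothetical weights and scores, not the actual
# SPSCC grading scheme. Written assignments and hands-on projects tend to
# score higher than tests, raising the overall course percent above the
# test-only average.

def overall_percent(tests, assignments, projects,
                    w_tests=0.60, w_assign=0.20, w_proj=0.20):
    """Weighted overall course percent (weights are assumptions)."""
    avg = lambda xs: sum(xs) / len(xs)
    return (w_tests * avg(tests)
            + w_assign * avg(assignments)
            + w_proj * avg(projects))

tests = [72.0, 78.0, 75.0]        # test scores tend to run lower
assignments = [95.0, 92.0]        # written assignments score higher
projects = [90.0, 88.0]           # hands-on projects score higher

test_only = sum(tests) / len(tests)
course = overall_percent(tests, assignments, projects)
print(f"test-only average: {test_only:.1f}%, overall course: {course:.1f}%")
```

Under these assumed numbers the test-only average is 75.0% while the overall course percent is 81.5%, which mirrors the pattern in the results above: an assessment-test score close to the overall course score indicates relatively strong performance on testing alone.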