Experiences in Developing a Web-based Assessment System




Troy Harding
Engineering Technology Department, Computer Systems Technology
Kansas State University Salina

Abstract

Like many departments around the country, the Engineering Technology Department at Kansas State University Salina is trying to find ways to effectively manage assessment of its programs. Students in the Web Development Project course were assigned the task of developing a prototype assessment system to manage and track student learning outcomes. In addition, the specifications called for a way to track suggestions for program and course improvement. This paper describes the project and its challenges.

Introduction

The Engineering Technology Department at Kansas State University is moving toward a more outcome-based model for its degree programs. The biggest motivation behind this move comes from two accreditation agencies. The North Central Association provides accreditation for K-State University as a whole. In addition, most of the programs in our Engineering Technology Department are accredited by the Technology Accreditation Commission of the Accreditation Board for Engineering and Technology (TAC/ABET). 1 Both of these agencies emphasize the need for a student learning assessment plan that specifies the desired outcomes that graduates should achieve. Furthermore, there should be a trail of documentation to demonstrate what was done to meet these outcomes and how well the outcomes were met. The documentation should show a commitment to a continuous improvement process. 2 Within our department's student learning assessment plan, degree program student learning outcomes (SLOs) are connected to department SLOs, which are ultimately connected to university SLOs. 3 The university SLOs are the broadest type and the program SLOs are the most specific. A graduate fulfilling a set of program SLOs will, by default, also fulfill the department and university SLOs.
In addition, TAC/ABET requires all graduates in accredited programs to meet specific program criteria. 3 This makes it important that we are able to demonstrate how our program SLOs map to TAC/ABET's criteria. In addition to the different types of SLOs and criteria, we also have course outcomes to add to the mix. The course outcomes map to the program SLOs. Tracking course outcomes allows us to see where in our curriculum we are meeting a specific program SLO. From a program SLO we should then be able to see where in our curriculum we are covering a particular university SLO and/or TAC/ABET requirement. Of course, there is much more to assessment than just the pieces described above. As we continue to build our department assessment plan, we will need to track more pieces of information. For example, there will be such things as the suggestions, measures, criteria, and improvements that are part of a continuous improvement process.
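The hierarchy just described (course outcomes map to program SLOs, which map to department and then university SLOs, with program SLOs also mapped to TAC/ABET criteria) is naturally relational. The following is only a hypothetical sketch of such a schema, using SQLite in place of the Access database the students actually used; the paper does not describe their real schema, so every table and column name below is illustrative.

```python
import sqlite3

# Hypothetical sketch: the students' actual Access schema is not described
# in this paper, so all names below are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE university_slo  (id INTEGER PRIMARY KEY, description TEXT);
CREATE TABLE department_slo  (id INTEGER PRIMARY KEY, description TEXT,
                              university_slo_id INTEGER REFERENCES university_slo);
CREATE TABLE program_slo     (id INTEGER PRIMARY KEY, program TEXT, description TEXT,
                              department_slo_id INTEGER REFERENCES department_slo);
-- a program SLO may satisfy several TAC/ABET criteria, and vice versa
CREATE TABLE abet_criterion  (id INTEGER PRIMARY KEY, description TEXT);
CREATE TABLE slo_criterion   (program_slo_id INTEGER REFERENCES program_slo,
                              criterion_id   INTEGER REFERENCES abet_criterion);
CREATE TABLE course_outcome  (id INTEGER PRIMARY KEY, course TEXT, description TEXT,
                              program_slo_id INTEGER REFERENCES program_slo);
""")

# With the chain in place, one join walks from a university SLO down to the
# courses that cover it -- the kind of report the committee asked for.
conn.executescript("""
INSERT INTO university_slo VALUES (1, 'Communicate effectively');
INSERT INTO department_slo VALUES (1, 'Produce technical documentation', 1);
INSERT INTO program_slo    VALUES (1, 'ET-CWDT', 'Document web projects', 1);
INSERT INTO course_outcome VALUES (1, 'Web Development Project',
                                   'Write a project report', 1);
""")
rows = conn.execute("""
    SELECT co.course
    FROM course_outcome co
    JOIN program_slo    p ON co.program_slo_id   = p.id
    JOIN department_slo d ON p.department_slo_id = d.id
    WHERE d.university_slo_id = 1
""").fetchall()
print(rows)  # [('Web Development Project',)]
```

The join in the example is exactly the "where in our curriculum are we covering a particular university SLO" question posed above.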

A department assessment committee was formed with representatives from each of the different program areas. As the committee grappled with how to document the assessment process, it was suggested that we might try building a database application to store the information. This eventually led to the idea of making this prototype assessment system a student project. In the Spring 2004 semester the best fit for this type of project was my Web Development Project course. This is a capstone course in the Web Development Technology associate degree program. One of the big challenges for the committee was to come up with system specifications for an assessment plan that was still in its early stages of development.

System Specifications

The department assessment committee specified that the system should include the following input forms:

- Student Learning Outcomes (SLOs) for the University
- SLOs for the College
- SLOs for the Engineering Technology (ET) Department
- SLOs for each academic program option (ET-CET, ET-MET, ETB-MET, ET-ECET, ETB-ECET, ET-CMST, ETB-CMST, ET-CWDT)
- SLOs for each course or activity
- Measurement methods for each course or activity SLO
- Suggested program and process improvements prompted by the advisory committee, faculty, etc.
- Implemented improvements prompted by the advisory committee, faculty, etc.

The following output reports were specified:

- SLOs for the University
- SLOs for the College
- SLOs for the ET Department
- SLOs for each academic program option (ET-CET, ET-MET, ETB-MET, ET-ECET, ETB-ECET, ET-CMST, ETB-CMST, ET-CWDT)
- Chart summarizing the University, College, Department, and program SLOs (one for each program option)
- Chart matching program SLOs to TAC/ABET criteria
- SLOs for each course
- Tabular list and matrix of program SLOs vs. courses
- Chart showing measurement methods, improvements, etc. for a selected academic program option (ET-CET, ET-MET, etc.)
- Course cover sheet for each course showing which program SLOs are addressed in the course
- List of implemented program and process improvements (queries must support date ranges and program option(s))
- List of suggested program and process improvements (queries must support date ranges and program option(s))

In addition, the committee added the following access and security notes:

- Authorized users will be able to access the system from any location with web access.
- The Department Head will have read and write access to all database areas.

- Program coordinators will submit program option SLO revisions to the Department Head, who will review them and enter them into the database.
- Faculty will have read access to all areas and write access to their respective course areas.
- Other authorized individuals (administrators, accrediting agency personnel, etc.) will be given read access only.
- The Department Head will screen suggestions for program or process improvement before posting them for viewing by faculty or other authorized individuals.

The students met with the department head and each of the assessment committee members to better understand the requirements and expectations.

Project Description

The prototype system is built on a Microsoft Windows server platform running Internet Information Services. The programming is done in Active Server Pages (ASP), and Microsoft Access is used for the database. On the client side, most current web browsers should work as long as cookies are accepted. The system was tested with current versions of Microsoft Internet Explorer, Netscape, and Mozilla. The interface has a banner along with buttons across the top of the window. The buttons provide access to different areas of the system (see Figure 1). Once a particular area is accessed, a menu is displayed on the left side of the window. A login form and/or login information is also provided in the left panel.

Figure 1
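The cookie requirement mentioned above exists because login state is tracked per session: each protected page re-checks the visitor before rendering anything. A rough, hypothetical Python sketch of that per-page gatekeeping pattern follows; the production code was classic ASP against an encrypted user database, and every name and value below is illustrative.

```python
# Hypothetical sketch: every protected page runs a check like this before
# rendering. The real system used classic ASP sessions; all names and
# values here are illustrative, not taken from the students' code.
SESSIONS = {}                                    # session cookie -> username
USERS = {"faculty1": 1, "admin1": 3}             # username -> access level
PAGE_LEVELS = {"slo_entry": 1, "user_admin": 3}  # page -> minimum level

def require_access(cookie, page):
    """True if the cookie maps to a logged-in user whose level meets the
    page's minimum; otherwise the page would redirect to the login form."""
    user = SESSIONS.get(cookie)
    if user is None:
        return False                 # no valid session: force a login
    return USERS[user] >= PAGE_LEVELS.get(page, 1)

# Usage: a faculty-level session can reach SLO entry but not user admin.
SESSIONS["c0ffee"] = "faculty1"
print(require_access("c0ffee", "slo_entry"))   # True
print(require_access("c0ffee", "user_admin"))  # False
print(require_access("badbad", "slo_entry"))   # False: unknown session
```

Because the check runs on every page request rather than only at login, a user whose session expires or whose permissions change is caught immediately.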

A challenge-based user authentication and access security system is used. In this type of system, each web page that is requested checks to make sure the user is logged in and has proper permission to access the page. The user database for controlling access to the system is encrypted and kept separate from the rest of the assessment data. Two levels of users are supported by the security system. Level 1 users are faculty users. These users have the ability to add SLOs at the program and course level. In addition, they can run all the assessment reports. Level 3 users have administrator privileges. These users have all the Level 1 capabilities plus the ability to add other users and modify user information.

A forum area is provided to track suggestions and comments for program and process improvement (see Figure 2). The students modified an open-source bulletin board system, Philboard, 4 and integrated it into the assessment system. This allows users to post suggestions and make comments on previously posted comments. Before a post becomes publicly viewable, it must be approved by a Level 3 administrator. Any time a new post is made to the forum, an email message is sent to a designated moderator notifying them of the entry. The moderator can log in with their administrator account, then edit and approve the post. Users can add comments and suggestions to current posts by replying in the same thread. In addition, users can start new threads when desired. Administrative users also have the ability to create whole new forums within the bulletin board system.

Figure 2

The system also provides menus for inputting, editing, and reporting SLOs (see Figure 3). Most of the options on the menus provide a description of the necessary input fields. In addition, there is a separate help area that provides information about the different areas and the system in general.

Challenges

Students ran into several major challenges during the development of the system.
Initially, the students had to spend a significant amount of time just learning what this system was supposed to do and all the terminology associated with assessment. The fact that our department was still in the early stages of developing an assessment plan added to the challenge. Many times students would ask a question about something that had yet to be hashed out by the assessment committee. This sometimes led to conflicting information and, consequently, confusion among the students. Though the specifications listed above didn't change throughout the semester, the students' understanding and approach changed a great deal from the first few weeks of the semester to the second half. The biggest impact of this was on the database design. The students spent a great deal of time designing and redesigning the database in the first half of the semester, which greatly reduced the time they had left to do the programming for the rest of the system. With the constraints of working within a semester time frame, they had to scale back somewhat on the scope of what they had hoped to do.

Another significant problem for the students was communication. There were 12 students in the class, divided into three teams working on different aspects of the system. The students had to learn to work together and coordinate their activities. Different levels of understanding, coupled with different approaches and a lack of effective communication between team members and between teams, led to a lot of inefficiencies in the first half of the semester. However, by the second half of the semester the students did a very good job of coordinating their activities.

Figure 3

Conclusion

The prototype system developed by the students still has a lot of rough edges and needs some refining. However, it provides a good base to build on. It shows a lot of promise as a tool to help faculty track and document the assessment process, which will hopefully become part of a continuous improvement process.

References

1. Criteria for Accrediting Engineering Technology Programs, Technology Accreditation Commission of the Accreditation Board for Engineering and Technology, November 1, 2003, http://www.abet.org/images/criteria/t001%2004-05%20tac%20criteria%201-19-04.pdf, accessed June 18, 2004.
2. School Improvement Plan, North Central Association Commission on Accreditation and School Improvement, http://www.ncacasi.org/standard/postsec/sip, accessed June 18, 2004.
3. Basic Elements of an Assessment of Student Learning Plan, Kansas State University Assessment and Program Review, http://www.k-state.edu/apr/learning/asl.htm, accessed June 18, 2004.
4. Philboard, http://www.youngpip.com/philboard.asp, accessed March 29, 2004.

Biographical Information

TROY HARDING
Assistant Professor, Computer Systems Technology, Kansas State University Salina (tdh@ksu.edu)