Florida State College at Jacksonville




2012-2013 Curriculum Services

Mission / Purpose
The Curriculum Services Office is committed to providing the highest quality of service to faculty, administrators, and staff. The office advises and assists faculty proposal originators, deans, and other instructional supervisors throughout the process, covering overall development of the curriculum proposal, specific instructions on procedures, and submission, to ensure compliance with College and State guidelines. In addition to facilitating the college-wide Curriculum Committee, the Curriculum Services Office maintains and updates the College Catalog, Orion Course Dictionary, and Statewide Course Numbering System (SCNS). The Curriculum Office supports the College's mission to provide the highest quality programs and courses to ensure a positive experience for students.

Other Outcomes/Objectives, along with Any Associations, Targets, and Findings

O/O 1: Process Curriculum Proposals
The Curriculum Services Office will efficiently process curriculum proposals that are submitted.

M 1: Log and Track Curriculum Proposals
During the academic year, from September through June, a log will be maintained of all received curriculum proposals signed by Campus Presidents, identifying the date received. Eighty percent of curriculum proposals will be finalized and submitted to the Vice President/Provost, State College Division, for approval within 90 days.

M 2: Curriculum Services Survey
By the eighth week of the Spring 2013 semester, College faculty, administrators, and staff will be asked to respond to a Curriculum Services Survey that contains at least two questions about the timeliness of services received (i.e., assistance with preparing proposals, feedback on technical review of proposals). Created in Survey Monkey, the instrument will be disseminated via an email message that contains a Web link. Staff members in the Offices of Curriculum Services and Academic Foundations will analyze the results of the survey in the Spring 2013 semester. Results will be disseminated to members of the College community via an electronic report about the efficiency of Curriculum Services' feedback on curriculum proposals.

O/O 2: Systems Input
The Curriculum Services Office will input all newly approved curriculum into various systems in a timely manner (i.e., College Catalog, Orion Course Dictionary, Outline, SCNS).

M 3: Log and Track Input (2011-2012)
The Curriculum Services Office will log and track input to the College Catalog, Orion Course Dictionary, Outline, and SCNS. The Curriculum Services Office will complete all systems input within five business days of final administrative approval.

M 4: Curriculum Services Survey
By the eighth week of the Spring 2013 semester, College faculty, administrators, and staff will be asked to respond to a Curriculum Services Survey that contains at least two questions about the timeliness of systems input, that is, the logging and tracking of information into the College Catalog, Orion Course Dictionary, Outline, and SCNS. Created in Survey Monkey, the instrument will be disseminated via an email message that contains a Web link. Staff members in the Offices of Curriculum Services and Academic Foundations will analyze the results of the survey in the Spring 2013 semester. Results will be disseminated to members of the College community via an electronic report about the timeliness of Curriculum Services' systems input.
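Measure M 1 above amounts to checking what share of logged proposals move from receipt to final submission within 90 days. The following is a minimal sketch of that calculation, assuming a hypothetical CSV log; the file path and column names are illustrative stand-ins, not the office's actual log layout.

```python
# Minimal sketch of the M 1 turnaround check. Assumes a hypothetical CSV log
# with ISO-formatted "date_received" and "date_submitted_to_provost" columns;
# these field names are illustrative, not the office's actual schema.
import csv
from datetime import date

def pct_within_90_days(log_path: str) -> float:
    """Return the percentage of proposals finalized within 90 calendar days."""
    total = within = 0
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            received = date.fromisoformat(row["date_received"])
            submitted = date.fromisoformat(row["date_submitted_to_provost"])
            total += 1
            if (submitted - received).days <= 90:
                within += 1
    return 100.0 * within / total if total else 0.0

if __name__ == "__main__":
    pct = pct_within_90_days("curriculum_proposal_log.csv")  # hypothetical file
    print(f"{pct:.1f}% finalized within 90 days (target: 80%)")
```

The same pattern, with a five-business-day threshold, would serve for M 3's systems-input tracking.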

O/O 3: Customer Service
The Curriculum Services Office will provide excellent customer service regarding curriculum development, approval, and communication college-wide.

M 4: Curriculum Services Survey
By the eighth week of the Spring 2013 semester, College faculty, administrators, and staff will be asked to respond to a Curriculum Services Survey that contains at least two questions about the timeliness of systems input, that is, the logging and tracking of information into the College Catalog, Orion Course Dictionary, Outline, and SCNS. Created in Survey Monkey, the instrument will be disseminated via an email message that contains a Web link. Staff members in the Offices of Curriculum Services and Academic Foundations will analyze the results of the survey in the Spring 2013 semester. Results will be disseminated to members of the College community via an electronic report about the quality of services provided by the Office of Curriculum Services.

M 5: Curriculum Services Survey
By the eighth week of the Spring 2013 semester, College faculty, administrators, and staff will be asked to respond to a Curriculum Services Survey that contains at least two satisfaction questions about Curriculum Services (i.e., helpfulness of feedback during proposal development, professionalism of staff members, quality of technical review). Created in Survey Monkey, the instrument will be disseminated via an email message that contains a Web link. Staff members in the Offices of Curriculum Services and Academic Foundations will analyze the results of the survey in the Spring 2013 semester. Results will be disseminated to members of the College community via an electronic report about the quality of services provided by the Office of Curriculum Services. Source of Evidence: Service Quality.

M 6: Collegewide Focus Group Feedback
During the Fall 2012 semester, the Curriculum Services team will conduct a collegewide focus group about the quality of services rendered to faculty, administrators, and staff. Questions will address the helpfulness of feedback during proposal development, the usefulness of the technical review, and strategies for enhancing the services. Staff members in Curriculum Services will produce a written transcript of the focus group and then conduct a qualitative analysis of recurring themes and patterns. Finally, a report of the salient findings will be written and disseminated to members of the College community via email. The Curriculum Services Office will gather at least three recommendations for continuous quality improvement as a result of the focus groups.

2012-2013 Institutional Effectiveness and Accreditation

Mission / Purpose
The Office of Institutional Effectiveness and Accreditation provides leadership, support, and resources for institutional effectiveness and regional accreditation. These processes assist the institution in maintaining SACS accreditation, promoting its achievement of mission and goals, and fostering continual enhancement of the institution's programs and services for the benefit of the College community. OIEA is responsible for SACS accreditation correspondence and reports, Quality Enhancement Plan coordination, Substantive Changes, and other reaffirmation and compliance activities. OIEA supports annual institutional effectiveness activities of academic programs, educational support services, and nonacademic units.
Other Outcomes/Objectives, along with Any Associations, Targets, and Findings

O/O 1: Substantive Change Procedure 1 Prospectus Applications (2011-2012) (Assessed Again 2012-2013)
The Office of Institutional Effectiveness and Accreditation completes Substantive Change Procedure 1 Prospectus applications in a timely manner.

2 The College will be innovative, resourceful, effective and accountable in the pursuit of these goals. Student completion of degrees and certificates is a priority. Standards of performance for employees and organizational units will be of the highest order with a clear expectation of continuous quality improvement. Ultimate accountability shall pertain to demonstrated outcomes and other definitive evidence of success pursuant to the College's comprehensive institutional effectiveness program.

M 1: Substantive Change Procedure Level Determination (2011-2012) (2012-2013)
In order to develop the appropriate SACS prospectus in a timely manner, the first step of the process is determining the type of Substantive Change and whether a prospectus is required. This determination should be conducted in a timely manner. During October 2012-April 2013, the Office of Institutional Effectiveness and Accreditation (OIEA) will track the time it takes to analyze a potential SACS Substantive Change and notify the program or unit with an internal judgment on the type of substantive change and procedure level required (Procedure 1, 2, or not applicable). OIEA will develop a tracking log that contains the program/unit name, the date the Substantive Change form is submitted (complete), the date the judgment is made, the type of Substantive Change, the number of business days to make the judgment, the date that the program/unit is notified, and the reason for delay (if applicable). OIEA staff will analyze the tracking log across all Substantive Change forms and across each of the log elements specified above. 90% of all Substantive Change procedure determinations should be made by the Director of OIEA within 5 business days of receiving a completed Substantive Change Intake Form from the academic program or unit.

Related Action Plans (by Established cycle, then alpha):
Substantive Change Procedure Determination
The department recognizes that the target was partially met and the impact of workload on the finding. The department would like to remeasure this important outcome again in the 2012-2013 cycle and focus on meeting the target within the specified 5 business days for the time period October 2012-April 2013.
Established in Cycle: 2011-2012
Implementation Status: Planned
Priority: High
Relationships (Measure | Outcome/Objective):
Measure: Substantive Change Procedure Level Determination (2011-2012) (2012-2013)
Outcome/Objective: Substantive Change Procedure 1 Prospectus Applications (2011-2012) (Assessed Again 2012-2013)
Responsible Person/Group: Associate Vice President of Institutional Effectiveness and Accreditation

M 2: Evaluation of Timeliness of Substantive Change Prospectus Development by Prospectus Team (2011-2012) (2012-2013)
From October to December 2012, the director of OIEA will disseminate a survey to determine the timeliness of the SACS Substantive Change Prospectus development period. The survey will be disseminated to College employees who were involved in the development of a prospectus during this timeframe. The survey will include questions regarding the proposed development timeframe and the timeliness of receiving feedback on drafted sections of the prospectus. The survey will be designed with a Likert scale (1 is strongly disagree, and 5 is strongly agree). The survey will also include open-ended questions regarding ways to streamline the development process and requests for additional information to make the process more efficient.
The responses will be analyzed across all respondents per question. Source of Evidence: Evaluations. 80% of the responses for each of the two criteria will be agree (4) or strongly agree (5).

Related Action Plans (by Established cycle, then alpha):
Prospectus Development Timeliness
The department did not meet the target of 80% for the 2011-2012 cycle. One of the findings of the survey was the impact of obtaining the curriculum committee meeting minutes for evidence in the prospectus package. In discussion with the curriculum services manager, a solution was determined, and thus the department would like to reassess this outcome and measure in 2012-2013 with the following adjustments:
1. Develop the prospectus timeline around the curriculum committee process.
2. Restart a practice used in the past whereby a launch conference call is held with all participants involved to lay out the timeline and obtain suggestions on it.
3. Add to the timeline the dates by which the OIEA department provides feedback on drafted sections.
Established in Cycle: 2011-2012
Implementation Status: Planned
Priority: High
Relationships (Measure | Outcome/Objective):
Measure: Evaluation of Timeliness of Substantive Change Prospectus Development by Prospectus Team (2011-2012) (2012-2013)
Outcome/Objective: Substantive Change Procedure 1 Prospectus Applications (2011-2012) (Assessed Again 2012-2013)
Responsible Person/Group: Associate Vice President of Institutional Effectiveness and Accreditation
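The determination measure above (M 1) and the Procedure 2 measures that follow hinge on the same calculation: count the business days between a submission or judgment date and the resulting action, then report the share of cases that meet the target. The sketch below illustrates that calculation; the sample date pairs and the use of numpy are illustrative assumptions rather than the office's actual tooling.

```python
# Minimal sketch of the business-day calculation behind M 1 (and M 3/M 4 below),
# assuming a hypothetical list of (submitted, judged) ISO date pairs.
# numpy's busday_count counts weekdays only; College holidays would need to be
# supplied via the holidays= argument if the office tracks them.
import numpy as np

def pct_within_business_days(pairs, limit=5):
    """Percentage of determinations made within `limit` business days."""
    if not pairs:
        return 0.0
    met = sum(
        np.busday_count(submitted, judged) <= limit
        for submitted, judged in pairs
    )
    return 100.0 * met / len(pairs)

# Hypothetical sample entries from the tracking log.
sample = [("2012-10-01", "2012-10-05"), ("2012-11-12", "2012-11-21")]
print(f"{pct_within_business_days(sample):.0f}% within 5 business days (target: 90%)")
```

Swapping the limit to ten business days gives the two-week checks used for the Procedure 2 measures.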

O/O 2: Substantive Change Procedure 2 Notifications (2012-2013)
The Office of Institutional Effectiveness and Accreditation processes Substantive Change Procedure 2 notifications in a timely manner.

2 The College will be innovative, resourceful, effective and accountable in the pursuit of these goals. Student completion of degrees and certificates is a priority. Standards of performance for employees and organizational units will be of the highest order with a clear expectation of continuous quality improvement. Ultimate accountability shall pertain to demonstrated outcomes and other definitive evidence of success pursuant to the College's comprehensive institutional effectiveness program.

M 3: Analyzing potential SACS Substantive Change
From March 1 to May 31, 2013, OIEA will track the time it takes to analyze a potential SACS Substantive Change (substantive changes to program offerings, mission, initiation of coursework through contracts, off-site locations, etc.) and notify the program or unit within two weeks with an internal judgment on the type of substantive change and procedure level required (Procedure 1, 2, or not applicable). OIEA staff will analyze the tracking log by type of substantive change, date the request was received, date the internal judgment was made, number of business days to make the internal judgment, and reason for delay (if applicable) to determine patterns of efficiency and delay. 90% of SACS Substantive Change internal judgments will be made within 2 weeks of a submission (with complete information) to OIEA.

M 4: Submitting Procedure 2 letter of notification to SACS
For Procedure 2 Substantive Changes, the Office of Institutional Effectiveness and Accreditation (OIEA) will track the time it takes to submit a Procedure 2 letter of notification to SACS after making the internal judgment on the procedure level required. This will be tracked from March 1 to June 15, 2013. OIEA will maintain a tracking log with the type of substantive change, date of internal judgment, date that the Procedure 2 letter of notification is sent to SACS, number of business days to submit the letter, and reason for delay, if applicable. OIEA staff will analyze all log data across all substantive changes and look for patterns in efficiency and delays. 90% of SACS Substantive Change Procedure 2 Notification Letters will be sent to the SACS COC within two weeks of OIEA making the internal judgment on the required Procedure 2 level.

O/O 3: Institutional Effectiveness Committee (2012-2013)
The Office of Institutional Effectiveness and Accreditation provides effective and timely orientation, training, and support for members of the collegewide Institutional Effectiveness Committee.

2 The College will be innovative, resourceful, effective and accountable in the pursuit of these goals. Student completion of degrees and certificates is a priority. Standards of performance for employees and organizational units will be of the highest order with a clear expectation of continuous quality improvement. Ultimate accountability shall pertain to demonstrated outcomes and other definitive evidence of success pursuant to the College's comprehensive institutional effectiveness program.

M 6: October 2012 Committee Survey
After conducting Institutional Effectiveness Committee Rubric Range-Finding training in October 2012, OIEA staff will administer a survey to committee members. In this survey, committee members will be asked to respond anonymously to a series of questions about the effectiveness and timeliness of the training. The survey will be designed on a Likert scale of strongly agree to strongly disagree.
The survey will include questions about the timeliness of the training (as it relates to the committee members' role in reviewing IE Assessment plans/reports) and the effectiveness of the training in preparing committee members to review IE Assessment plans/reports. The Likert-scale survey responses will be analyzed across all respondents by effectiveness questions, by timeliness questions, and for suggestions in the open-ended responses. The results will be able to be disaggregated by new and returning committee members. Source of Evidence: Client satisfaction survey (student, faculty). 80% of the responses for each question will be agree or strongly agree.

M 7: April 2013 Committee Survey
In April 2013, before the committee conducts Academic assessment plan/report reviews, OIEA would like to resurvey IE Committee members. OIEA staff will administer a survey asking them to respond anonymously to a series of questions about the support, training, and orientation that has been provided to them since the committee began convening in September 2012. The survey's purpose will be to obtain committee members' perceptions of how well the orientation, training, and support helped them to fulfill each of the committee's charges, such as setting working group goals, assessing the institutional IE culture, etc. The survey will be designed on a Likert scale of strongly agree to strongly disagree. The survey responses will be analyzed across all respondents and for suggestions in the open-ended responses. The results will be able to be disaggregated by new and returning committee members. Source of Evidence: Client satisfaction survey (student, faculty). 80% of the responses for each question will be agree or strongly agree.
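The survey measures above (M 2, M 6, and M 7) all apply the same yardstick: at least 80% of responses to each question must be agree (4) or strongly agree (5) on the five-point Likert scale. The sketch below shows that tally; the response structure is a hypothetical stand-in for a Survey Monkey export, and the question names are illustrative.

```python
# Minimal sketch of the 80%-agreement check used by the survey measures above.
# Assumes responses have already been reshaped into a dict keyed by question,
# with each value a list of 1-5 Likert ratings (a hypothetical format, not the
# actual Survey Monkey export layout).
def agreement_rates(responses: dict[str, list[int]]) -> dict[str, float]:
    """Percent of responses per question that are agree (4) or strongly agree (5)."""
    return {
        question: 100.0 * sum(score >= 4 for score in scores) / len(scores)
        for question, scores in responses.items()
        if scores
    }

# Hypothetical sample data.
sample = {"timeliness_of_training": [5, 4, 3, 4, 5], "effectiveness_of_training": [4, 4, 5, 2, 4]}
for question, rate in agreement_rates(sample).items():
    status = "meets" if rate >= 80 else "below"
    print(f"{question}: {rate:.0f}% agree/strongly agree ({status} 80% target)")
```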

2012-2013 Resource Development

Mission / Purpose
The mission of Resource Development is to seek and obtain external funds from all available sources for the support of existing programs, the development of new instructional techniques, the evaluation of program effectiveness, and other purposes that will improve student achievement at Florida State College at Jacksonville.

Other Outcomes/Objectives, along with Any Associations, Targets, and Findings

O/O 1: Process grant applications
Resource Development processes grant applications to meet announced deadlines.

M 1: Data on proposal submission
Number of proposal ideas originating within the department in comparison to the number of proposals considered worthy of developing and pursuing. At least 75% of proposal ideas initiated within the department will be found to be worthy of pursuit and submitted to the funding agency.
Finding (2012-2013) - Met: 81% of proposals initiated within the department were submitted within the specified timeframe.

M 2: Data on submission by deadline
Number of proposals submitted within the timeframe established by the funding agency. At least 90% of all proposals initiated by the department will be submitted on or before the deadline.
Finding (2012-2013) - Met: 100% of proposals initiated by the department and completed were submitted within the agency's specified timeframe.

O/O 2: Assistance to design teams
Resource Development provides assistance to design teams in the development of a proposal.

M 3: Data on design team meetings
Number of face-to-face design team meetings conducted in which key members of the project outline the project's goals and objectives. During the development process of at least 90% of submitted proposals, at minimum one design team meeting will have been held, with at least one principal member of the project attending to provide input on the project's aims and objectives.
Finding (2012-2013) - Met: 95% of submitted applications involved at least one face-to-face design team meeting attended by at least one team member.

M 4: Number of design team members
Minimum number of people involved with the project necessary for a successful design team meeting. At least one person directly involved in the proposed project will attend at least one design team meeting to ensure that the objectives being included in the proposal conform to the aims of those administering and implementing the project.
Finding (2012-2013) - Met: At least one campus or staff member directly involved in the proposed project was in attendance at 100% of design team meetings.

M 5: E-mail messages from clients
E-mails from those working with the department on proposal development will be used to gauge client satisfaction with the process. Source of Evidence: Client satisfaction survey (student, faculty). At least six e-mail messages [sent to the Resource Development Director or each Resource Development Officer (RDO)] during the year will indicate client satisfaction with proposal development or design team meetings.

Finding (2012-2013) - Met: 16 unsolicited e-mail messages were received by Resource Development Officers that indicated appreciation of the proposal development process and the final application.

O/O 3: Maintain database
Resource Development maintains a process to monitor grant opportunities, proposals under development, and awarded projects.
1.5 Collegewide Goal Five (as of Aug 2011): Contribute significantly to the ongoing economic development of the Northeast Florida region.

M 6: Tracking grant activity through data
Reports are listed in a comprehensive database, notebooks, shared files, and annual reports that are available in the Resource Development office upon request. Records in the database allow the department to track award amounts, proposals submitted, and the number of proposals funded for the current year and prior years. 90% of grant activities are listed in the database, notebooks, shared files, and annual reports.
Finding (2012-2013) - Met: 100% of grant activities were recorded in the department's database, in notebooks, and in shared directories; an annual report was compiled and distributed; proposal tracking was begun through IT Works.

2012-2013 Student Analytics and Research

Mission / Purpose
The office of Student Analytics and Research (SAR) directly supports the advancement of the College mission, vision, values, and goals by developing and improving data-driven decision-making and contributing to the cycle of continuous improvement. Through the application of diverse analytical frameworks and methodologies, and via key collaborative efforts, Student Analytics and Research engages in a range of interrelated project activities involving focused student-related research and analysis to extract practical and actionable information for College decision-makers. Student Analytics and Research (SAR) assists college-wide leadership in fulfilling the mission of the institution by transforming data into information for decision making. Focused primarily on research and analytics to improve student access, retention, success, learning, and the overall academic experience, SAR is actively engaged in identifying and applying advanced analytical technologies (e.g., predictive analytics) to gauge and improve student performance. Ancillary SAR roles and responsibilities involve project collaborations with a wide range of internal and external institutional partners. Internally, SAR serves and supports college-wide leadership in all College divisions. SAR works directly with a growing number of offices, groups, and departments, including Institutional Effectiveness and Accreditation, Liberal Arts & Sciences, Workforce Development, Resource Development (grants), Learning Research, and Information Systems/Collegewide Data Reporting, as well as an expanding contingent of faculty groups, to provide timely analysis, consultation, and guidance. In addition to fulfilling the institutional research role for the College by administering, analyzing, and reporting nationwide comparative testing, survey, and benchmarking initiatives (e.g., ETS Proficiency Profile, CCSSE, SENSE, NCCBP), SAR also plays an increasing role in the rapid development, deployment, analysis, and reporting of emergent and customized college-wide instruments, including student and faculty surveys.
Externally, SAR maintains a network of related professional and organizational affiliations to identify, integrate, and apply best practices in varied disciplines, including data and text mining, statistical modeling, program evaluation, and institutional research. SAR also serves in both project management and project support roles to facilitate the completion of key College initiatives involving service expansion and institutional effectiveness, impacting student success beyond the classroom and the economic growth and development of the region.

Other Outcomes/Objectives, along with Any Associations, Targets, and Findings

O/O 2: Requests Turnaround
SAR processes data and survey analysis requests in a timely and effective manner.
1.5 Collegewide Goal Five (as of Aug 2011): Contribute significantly to the ongoing economic development of the Northeast Florida region.

M 3: Analysis of departmental requests
Analysis of JIRA output from January 1, 2011 through June 30, 2011 will be used to determine an average length of time needed to complete data requests. Analyze 80% of requests to establish a baseline.

Related Action Plans (by Established cycle, then alpha):
Baseline for request completion
The Student Analytics and Research unit exceeded its target of reviewing 80% of the total data requests entered into the JIRA system during the January 1, 2011 through June 30, 2011 period. Total request volume was manageable enough to review 100% of the requests. The calculated 30-day average request turnaround will be used as a baseline for future process improvements to be considered in a later cycle.
Established in Cycle: 2010-2011
Implementation Status: Finished
Priority: High
Relationships (Measure | Outcome/Objective):
Measure: Analysis of departmental requests
Outcome/Objective: Requests Turnaround

O/O 3: Instructional JIRA Video
The office of Student Analytics and Research has worked with Learning Innovations to produce an instructional video illustrating to individuals how to create a request in JIRA. This will assist individuals who have not used JIRA to input their own requests. Additionally, we will survey individuals after they have completed a request to obtain feedback on whether the video is helpful.
General Education/Core Curriculum Associations: 1 Communication
Major Priorities Associations: Florida State College at Jacksonville: 1 Secure the reaffirmation of the College's accreditation with an emphasis on improving student completion

M 4: JIRA Video focus group
SAR will conduct a series of surveys and one-on-one sessions to evaluate the ease of understanding of the video, and will analyze the surveys and input gathered. Source of Evidence: Video or audio tape (music, counseling, art). 75% of the target group will respond to or answer questions regarding the video.
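The 30-day average turnaround baseline cited under M 3 above comes down to averaging the elapsed time between when a request is created in JIRA and when it is resolved. A minimal sketch of that calculation follows; the CSV export format and column names are illustrative assumptions, not SAR's actual JIRA configuration.

```python
# Minimal sketch of the request-turnaround baseline described under M 3.
# Assumes a hypothetical CSV export of JIRA issues with ISO-formatted
# "created" and "resolved" columns; real JIRA exports differ, and these
# column names are illustrative only.
import csv
from datetime import date

def average_turnaround_days(export_path: str) -> float:
    """Average calendar days from request creation to resolution."""
    durations = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            if not row.get("resolved"):
                continue  # skip requests that are still open
            created = date.fromisoformat(row["created"])
            resolved = date.fromisoformat(row["resolved"])
            durations.append((resolved - created).days)
    return sum(durations) / len(durations) if durations else 0.0

if __name__ == "__main__":
    avg = average_turnaround_days("jira_requests.csv")  # hypothetical export file
    print(f"Average turnaround: {avg:.1f} days (baseline cited: 30 days)")
```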