Capstone Suggestions for Survey Development for Research Purposes




1. Begin by listing the questions you would like to answer with the survey. These questions will be relatively broad and should be based on theory and past research. (A typical survey for research may attempt to answer 2-4 major research questions, though you may have additional smaller questions.)

2. Having determined the questions you would like to answer with the survey, consider the appropriate item types to use for each question. To measure attitudes or opinions, consider Likert scale items or adjective items (for overall rating items: excellent, good, fair, etc.), or, in rare cases, multiple-choice items. To measure behavior, consider using multiple-choice or adverb items for perceptions of frequency of behavior. To measure knowledge, use multiple-choice or open-ended items. Demographic information can also be obtained using multiple-choice or open-ended formats.

Use open-ended items sparingly. They are useful in avoiding a response set (respondents falling into a pattern, such as selecting the socially desirable option), but they are very time-consuming for respondents to complete. Too many open-ended items requiring long responses will reduce your response rate. (The preceding is true for mailed surveys, but does not hold for phone surveys or interviews, where open-ended items can be quite useful.)

3. Introduction to the survey: Tell the respondents briefly what the purpose of the survey is, the agency and/or persons responsible for the survey (to establish credibility), and, if appropriate, the funder, and appeal to their self-interest to get them to complete the survey. There is no need to be formal here. Use the style that is most appropriate for the audience. If being "folksy" will get a better response rate, do it! If being "folksy" will hinder your credibility with the audience, don't do it! I think it's fine to start with "We need your help. You've been selected as part of a small sample to provide us with information on.... Your honest, careful responses will help us improve...." Finally, give a phone number they can call if they have questions, and offer to send them the results of the survey. If it's an internet or mailed survey, all but the specific instructions should be in the cover letter or cover page. If mailed, the cover letter should be on good stationery using the letterhead of the organization.

4. Format of the survey: You have two goals: (1) to make the survey appear short (not many pages) and (2) to make it look easy to complete (not too crowded, with plenty of space in the margins and for items). As indicated in #1, don't ask any unnecessary items. Use large margins. The order of the survey should have a logical flow in terms of the content of the items. Start with items that will seem important to the respondent and relate to the purpose of the survey. Do not start with items that would seem intrusive or threatening. Feel free to use headings to separate the various types of items. Occasionally, you may want to add new instructions or a message after a new heading. For example, I often precede the demographic items (which can seem irritating to the respondent) by saying, "Now we need some information about you to help us determine the needs of specific groups of clients" or "Now we need some information about you to help us learn more about how different groups think about this issue." Such instructions help the respondents know you're not just prying into their personal lives without reason. Having developed your headings and instructions, be sure to group items of similar types together. It is irritating to the respondent to skip from one item type to another.

5. Developing the actual items:

Likert items: These are very useful for assessing attitudes toward different aspects of one thing, e.g., attitudes toward a workshop, a policy, etc. These items provide far more useful information on what changes you should make than an overall rating item does. Further, the Likert scale (5 to 7 points) provides the respondent with a continuum that better represents an attitude than a "yes-no" response does, and it will give you more variability in your responses.

If you're measuring an attitude, first look in the standard references cited in your text, or in articles using surveys, to determine whether an existing, already validated attitude measure is available. If there is not, you will have to make your own. Develop sentences that reflect an attitude that persons with extreme beliefs on the dimension you are measuring would hold. Try to avoid sentences that state facts; the sentences should be statements with which respondents can legitimately agree or disagree. Don't write all your sentences so that they call for an "agree" response. Among the first few sentences, include some for which the desired response is "disagree." This ordering will avoid a response set and encourage the respondent to read all items carefully. Have your sentences address different components of the overall thing you are measuring. For example, if you're measuring attitudes toward an agency, begin with reception/orientation, move on to services and coordination, and finish with exit and overall satisfaction. With careful construction of sentences you can measure many different issues in a brief space.

Put the scale above the items, and put a line beside each sentence for participants to write the number that best represents their reaction to the sentence. This format takes up the least space. I recommend a 6-point format of Strongly Agree, Moderately Agree, Mildly Agree, Mildly Disagree, Moderately Disagree, Strongly Disagree for typical adult audiences. Such a format forces respondents into some type of agreement or disagreement, which is often desired. If you feel your respondents might truly have no opinion, use a 5-point format: Strongly Agree, Moderately Agree, Neutral, Moderately Disagree, Strongly Disagree. (I think a 7-point format is too confusing.) For children or audiences with little education, you might consider a 3-point format: Agree, Neutral, Disagree. Use this format only if your audience will not understand the others; the 3-point format greatly reduces your variability. (On many questionnaires, the main differences are between moderately and strongly agree. Once you eliminate that distinction, you have everyone agreeing, and you can't differentiate strong from moderate support.)
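When you later analyze a scale that mixes "agree" and "disagree" keyed sentences, the negatively worded items must be reverse-scored before averaging so that a given numeric value means the same thing on every item. A minimal sketch in Python, assuming a 6-point scale coded 1 (Strongly Agree) through 6 (Strongly Disagree); the item numbers and responses are hypothetical:

```python
# Reverse-score negatively worded Likert items before averaging.
# Assumes responses coded 1 (Strongly Agree) .. 6 (Strongly Disagree);
# item numbers and data below are hypothetical examples.

def reverse_score(value, scale_min=1, scale_max=6):
    """Flip a response on the scale: 1 <-> 6, 2 <-> 5, 3 <-> 4."""
    return scale_max + scale_min - value

# One respondent's answers to items 1-5; items 2 and 4 are negatively worded.
responses = {1: 2, 2: 5, 3: 1, 4: 6, 5: 3}
reverse_keyed = {2, 4}

scored = {item: reverse_score(v) if item in reverse_keyed else v
          for item, v in responses.items()}

mean_score = sum(scored.values()) / len(scored)
print(scored)      # {1: 2, 2: 2, 3: 1, 4: 1, 5: 3}
print(mean_score)  # 1.8  (low = favorable, once all items are keyed the same way)
```

After reverse-scoring, a low mean consistently indicates a favorable attitude, so item and subscale means can be compared directly.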

Adjective items: Use adjective items (Excellent, Good, Fair, Poor) only for overall rating items. These items are of limited use: they tell you nothing about what causes a good or poor rating, and one person's "good" is another's "poor." Such items can, however, be useful as a "red flag" or screening item. In that case, your survey may be the first in a series: you may use rating items to identify areas of dissatisfaction, and future surveys may collect more information on why there is dissatisfaction in those areas. Alternatively, you could follow an adjective rating item with an open-ended item asking respondents to indicate why they gave the rating.

Adverb items: These are useful for getting participants' perceptions of the frequency of a behavior. How often do they perceive they do a particular thing or receive a particular service from your agency? Make sure your adverbs form a continuum. Words to consider: Always, Frequently, Often, Occasionally, Sometimes, Seldom, Rarely, Never. If you want to measure the actual behavior, use a multiple-choice item listing actual frequencies, or a short-answer, open-ended item asking respondents to write the frequency of the behavior in a blank. Always consider whether respondents will know the actual frequency. If they will be estimating, or will not want to report the actual number (as with income), a multiple-choice item with ranges is preferable. Be careful in developing the ranges to ensure they will give you the information you need; i.e., you probably don't want everyone selecting the same range.

Multiple-choice items: These are very useful for measuring behavior (see above under adverb items) or beliefs, e.g., "Which of the following are you most likely to use on your job?" In writing multiple-choice items, be sure all responses follow and read logically from the stem, make sure each response is mutually exclusive (not overlapping with other responses), and make the set exhaustive (include all possible options or an "other" option). Don't make the choices too long. If the item is long, try to get as many words as possible into the stem, rather than repeating them in each option.

6. Wording of items: Consider whether your responses are appropriate for the thing you are measuring. A yes-no item should be used only for things that are clearly yes-no, not a continuum. "Did you receive a syllabus?" is a yes-no item: you either did or you didn't. "Was the syllabus helpful in orienting you to the course?" would be better measured with a continuum, not "yes-no," as the syllabus could be very helpful, somewhat helpful, etc. As indicated above, if you have several items with a continuum, save space by converting all the questions to sentences and using a Likert scale for all of them. For example, you could change "Was the syllabus helpful in orienting you to the course?" to this Likert item: "The syllabus provided an excellent orientation to the course." (Strongly agree, moderately agree, mildly agree, etc.)

Make items as brief as possible. This is not the time to be unnecessarily wordy. Surveys should be edited many times to reduce and simplify the wording. Don't use jargon. Use a vocabulary level your audience can understand. (If your audience is more comfortable with jargon, use it.)

Make sure the reader will understand how to complete each item. Add instructions where necessary, such as when the item format changes. Should they check only one option, or all that apply? Should they circle the option or write a response in a blank?

Carefully consider the wording of open-ended items. Especially consider the verb you're using, and make it as specific as possible. "Explain why the MPA program has been useful to you" is much more vague than "If you have found the MPA program to be useful to you, describe how you have been able to use one aspect of the program in your current work." The only time your open-ended item should be vague is at the end of the survey, when you might want to ask, "Is there anything else you would like to tell us about...?" Such an item is appealing to those who have concerns and can provide you with unanticipated information.

7. Be sparing with demographic items. Consider how each piece of demographic information will be used. Such items are frequently "nice to know" items. Use only those that are essential to describing your sample or that you need in order to identify sub-groups of interest. Put demographic items at the end.

8. Pilot test your survey with a sample of people who are like your final audience. Administer it to them in the same way you will administer it to the final audience. (If you intend to mail the survey, hand the pilot audience the survey in the envelope with the cover letter and give no oral instructions.) Have them complete the survey as they normally would, though you can instruct them to mark any items or options they find unclear. After they finish the survey, discuss with them orally how they perceived it. Did they react positively? Were the instructions clear? Were the items clear? Was the language appropriate? Collect their surveys and analyze the data. Look at the trends in the data and see if you got what you expected. Did people respond to the open-ended items in the way you anticipated? On multiple-choice items, did people select a variety of options? Revise your survey based on the oral and written responses.
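The "did people select a variety of options" check on pilot data can be run quickly by tabulating the response distribution for each item; an item where nearly everyone picks the same option tells you little and may need revision. A short sketch, using hypothetical pilot responses and an illustrative 75% flagging threshold:

```python
from collections import Counter

# Hypothetical pilot responses: item -> list of options chosen by respondents.
pilot = {
    "Q1": ["a", "a", "a", "a", "b"],  # almost everyone picked "a": revise or drop
    "Q2": ["a", "b", "c", "b", "d"],  # healthy spread of options
}

for item, answers in pilot.items():
    counts = Counter(answers)
    # Share of respondents choosing the single most popular option.
    top_share = counts.most_common(1)[0][1] / len(answers)
    flag = "  <-- little variability" if top_share > 0.75 else ""
    print(f"{item}: {dict(counts)} (top option chosen by {top_share:.0%}){flag}")
```

The same tabulation also surfaces options nobody chose, which may be candidates for removal or rewording in the revised survey.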

Survey Plan: Student Opinions of the MPA Program

Research questions and the survey item(s) that address each:

1. Do students believe that the primary components of the program are achieving their intended purposes? (Key components: courses, faculty, schedule, facilities, advising. Courses: interesting, practical, at appropriate level, meet perceived needs.) (Items I.1-11)
   What changes do students recommend? (Items I.12-13 and II.8)
2. Do students feel fully informed about the new degree program and the options they can pursue? (Items I.6 and I.11)
3. What are students most interested in learning more about? (Item I.13)
4. How did students first learn of the program? Why did they enroll? (Items II.2-3)
5. How many students are currently working in the public or non-profit sectors? How many are managers? How many are managers in either of these sectors? What are the most common sectors of employment for our students? (Items II.4-6)
6. How many students would like assistance in job placement from us? (Item II.7)
7. Is there a difference in satisfaction of students by concentration? (Item II.1)

Student Opinions about the MPA Program

The SPA faculty are reviewing our program to determine how we might improve it. Students' feedback is a critical part of this process. Please help us by completing this short survey. Your thoughtful and honest responses will help us revise the MPA program to better meet your needs. Read each item carefully and respond as instructed. Your answers are anonymous, and all responses will remain confidential.

I. Use the scale shown below to respond to items 1-11. Write your answer on the blank beside each item number.

1=Strongly agree  2=Moderately agree  3=Slightly agree  4=Slightly disagree  5=Moderately disagree  6=Strongly disagree

1. The content of the courses is very stimulating to me.
2. The amount of work required in the courses is too demanding and difficult.
3. The faculty are helpful and friendly.
4. The content of the courses should be more advanced and delve deeper into the subject matter.
5. The scheduling of courses matches well with my own scheduling needs.
6. I don't feel fully informed about the program and my options.
7. The content of the courses is very practical.
8. The classrooms are comfortable and make learning easy.
9. The faculty need to work to make some of the courses more interesting.
10. I'm learning a lot that I have been or will be able to put to use in the workplace.
11. I know exactly what I need to do to complete the program.

12. Select one of items 1-11 in which you indicated dissatisfaction. How would you recommend we change the program to improve this area?

13. What do you think is missing from our curricula? What would you like to learn more about?

II. Now, we would like to learn a little bit about you. This information will help us determine how we're meeting the needs of groups of students. Please read each item and circle the response that best reflects your status or needs.

1. Program concentration:
   a. Public Administration
   b. Nonprofit Management
   c. Environmental Policy
   d. Local Government
   e. Unsure

2. How did you first learn of the program? (Circle all that apply.)
   a. As a student at Auraria
   b. Flyer/brochure
   c. From a friend who was a student or alumnus of the program
   d. From a co-worker who was a student or alumnus of the program
   e. From a talk by a SPA representative (advisor, faculty, Dean)
   f. Other:

3. Why did you enroll in the program? (Circle all that apply.)
   a. I enjoy going to school and learning
   b. To prepare myself to seek employment in a new area
   c. To increase my chances for promotion
   d. To improve my job performance in my current position
   e. To meet other people in my field
   f. Other (Please describe: )

4. Current employment status:
   a. Employed full-time
   b. Employed part-time
   c. Not employed outside the home (For questions 5 & 6, respond in reference to previous employment.)

5. Current sector of employment:
   a. Private
   b. Nonprofit
   c. Public

6. Nature of your work responsibilities:
   a. Clerical/sales
   b. Service delivery
   c. Supervisor
   d. Professional
   e. Manager
   f. Other (Please describe):

7. Would you like assistance from SPA in obtaining degree-related employment after completion of your degree?
   a. Yes, definitely
   b. Yes, possibly
   c. No, I will seek employment on my own.
   d. No, I do not plan on seeking employment at that time.

8. If you could change one thing about SPA, what would it be?

9. What do you like best about SPA?

Thanks for your help!