Analyzing Program Evaluation Data: How to Interpret Qualitative Data
1 Analyzing Program Evaluation Data: How to Interpret Qualitative Data Capt. Armen H. Thoumaian, Ph.D., USPHS Aaron Sawyer, Ph.D. Richard Best, Ph.D. Carmina Aguirre, M.A. April 21, 2015
2 Webinar Details This webinar presentation has been pre-recorded A live question-and-answer session will be held at the conclusion of the presentation Questions may be submitted anonymously at any time via the Question pod Audio for this presentation will be provided through Adobe Connect; there is no separate dial-in Live closed captioning is available in the Closed Captioning pod through Federal Relay Conference Captioning 2
3 Materials for Download Materials from this series and other program evaluation resources are available in the Files pod and at: For information on other DCoE webinar and training series, visit: 3
4 Continuing Education Details This continuing education activity is provided through collaboration between DCoE and Professional Education Services Group (PESG). DCoE's awarding of continuing education (CE) credit is limited in scope to health care providers who actively provide psychological health and traumatic brain injury care to active-duty U.S. service members, reservists, National Guardsmen, military veterans and/or their families. The authority for training of contractors is at the discretion of the chief contracting official. Currently, only those contractors with scope of work or with commensurate contract language are permitted in this training.
5 Continuing Education Details (continued) If you wish to obtain a CE certificate or a certificate of attendance, you must complete the online CE evaluation. After the webinar, visit to complete the online CE evaluation, and download your CE certificate/certificate of attendance. The CE evaluation will be open through Tuesday, April 28, 2015.
6 Presenter Capt. Armen Thoumaian, Ph.D., USPHS Deputy Chief of Integration Office of Shared Services Support, DCoE Capt. Armen Thoumaian is a scientist director in the Commissioned Corps of the U.S. Public Health Service (USPHS) with more than 30 years' experience in health and mental health program design and evaluation. In January 2012, Capt. Thoumaian joined the staff at the Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury (DCoE) to help design and implement program evaluation and improvement efforts in the Defense Department. USPHS Capt. Armen Thoumaian, Ph.D. He holds a B.A. in psychology and sociology, an M.A. in general experimental psychology, and a Ph.D. in social welfare and social work, and has completed a National Institute of Mental Health fellowship in Community Mental Health.
7 Presenters Aaron Sawyer, Ph.D. Research Scientist, Contract Support for DCoE Dr. Aaron Sawyer is a clinical psychologist with extensive expertise in intervention outcome research and program evaluation. He has delivered child, family and adult interventions for more than a decade, including specialization in trauma and experience working with military families. Dr. Sawyer holds an M.S. in experimental psychology and a Ph.D. in clinical psychology. He completed post-doctoral training at the Kennedy Krieger Institute/Johns Hopkins University and is a licensed psychologist. Dr. Aaron Sawyer Richard Best, Ph.D. Research Scientist, Contract Support for DCoE Dr. Richard Best is an industrial and organizational (I/O) psychologist with 14 years of experience conducting health services research in both the Veterans Health Administration and the Defense Department's Military Health System. He has extensive experience in research design, qualitative and quantitative data collection and analysis, and collaborating with clinical experts to translate research results into actionable recommendations. Dr. Best holds an M.S. and Ph.D. in I/O psychology and is certified in Prosci's Change Management Process. Dr. Richard Best
8 Moderator Carmina Aguirre, M.A. Research Scientist, Contract Support for DCoE Ms. Carmina Aguirre has over 14 years of experience within the Defense Department. Her background includes executive leadership, psychological health, sexual assault prevention and response, and public affairs. In addition to supporting DCoE, she serves as Chief of Public Affairs in the Florida Air National Guard. Ms. Aguirre holds a B.A. in psychology and an M.A. in human services with a specialization in executive leadership. Ms. Carmina Aguirre
9 Overview and Objectives This training presentation will describe how to code, analyze and interpret qualitative data. Qualitative data include text from interviews, focus groups, written comments, observations and case studies. At the conclusion of this webinar, participants will be able to: Explain how qualitative data can be used as part of a mixed methods approach to program evaluation Describe the steps needed to organize and code qualitative data Perform basic qualitative analyses and communicate findings Select and implement strategies to address common challenges related to qualitative data analysis 9
10 Agenda Introduction to Qualitative Analysis and Mixed Methods Analyzing and Interpreting Qualitative Data Reporting Qualitative Findings Common Challenges Conclusion Resources and References Feedback and Q&A Session 10
11 Introduction to Qualitative Analysis and Mixed Methods
12 Introduction Image courtesy of Hubble Heritage. "Not everything that can be counted counts, and not everything that counts can be counted." - William Bruce Cameron
13 No Single Method Is Superior Just as no single treatment/program design can solve complex social problems, no single evaluation method can fully explain a program Qualitative evaluation methods provide a more complete picture than quantitative methods alone, especially with regard to program processes and participant experiences Qualitative methods help us understand the richness and complexity of psychological health and traumatic brain injury programs 13
14 What Are Qualitative Methods? Qualitative methods are forms of data collection and analysis based on text or other non-numeric information You are likely already engaged in qualitative methods: Logic model development Notes about program participants Meeting minutes Staff and participant feedback 14
15 Advantages of Qualitative Methods Qualitative methods have distinct advantages for understanding meaning, context and processes, including: Gathering detailed descriptions Identifying unknown or unanticipated phenomena Generating evaluation questions Developing causal explanations (Patton, 2014) Conveying individual narratives related to a program or population (Krueger, 2010) 15
16 Qualitative Methods Are Used to Address Specific Evaluation Questions Qualitative methods can be used to explore: Descriptive questions (e.g., What is the program's purpose?) Causal questions (e.g., Why are participants dissatisfied?) Value questions (e.g., Is the program worth continuing?) Action questions (e.g., How can the program be improved?) (Rogers & Goodrick, 2010)
17 Mixed Methods Can Produce More Complete Findings Mixed methods combine the benefits of both qualitative and quantitative methods and can: Assess size and frequency and explore meaning and understanding Answer multiple evaluation questions using tailored methods (e.g., focus groups and statistical analyses) 17
18 Key Differences Between Qualitative and Quantitative Methods
Qualitative: Common data collection methods include interviews, focus groups, open-ended comments, observations, after action reviews and case studies. Data are generally text-based and more context-specific. Answers why, how and/or what questions. Data collection and analysis are generally time-intensive. Data collection tools are often flexible.
Quantitative: Common data collection methods include questionnaires, learning assessments and structured screening protocols. Data are number-based and often apply to a broader population. Answers how many, who, when and/or where questions. Data collection and analysis are generally efficient. Data collection tools are typically fixed.
19 Mixed Methods Example Mission: At Program Sierra*, we seek to ensure that service members who are wounded, ill or injured successfully reintegrate into civilian life or return to duty in the military. By performing our mission effectively, we hope to enhance force readiness and improve the quality and efficiency of services across the Defense Department. DoD photo by Pat Cubal *Program Sierra was formerly known as Program Echo. See Program Sierra objectives and logic model in the slides at the end of this presentation and Module 2 of the Program Evaluation Guide, 2nd Edition
20 Mixed Methods Example (continued) Evaluation goals: Program Sierra leadership and stakeholders have stated that the program appears to be reaching its intended population but want to know 1) whether the program is being implemented with quality 2) whether program activities lead to expected outcomes for participants Program nature and intent: program has SMART objectives and a detailed logic model Program maturity: program is in the implementation stage but is regularly assessing some outcomes 20
21 Mixed Methods Example (continued) Evaluation design: Program Sierra should undertake a process evaluation focused on services provided directly to participants (versus outreach activities) but may incorporate some elements of a summative evaluation design to examine available short-term outcome data Key evaluation questions: Was the program implemented with fidelity (e.g., as intended or planned)? To what extent did the program achieve the desired short-term outcomes? What should be improved or changed in the program to enhance its quality and effectiveness?
22 Mixed Methods Example (continued)
Evaluation question: What should be improved or changed in the program to enhance its quality and effectiveness?
Quantitative results: Satisfaction ratings have decreased from 85% to 42% over the past five years. Short-term outcomes showed significant improvement in attitudes but no change in service utilization from pre to post.
Qualitative results: In focus groups, participants reported case managers were supportive but lacked knowledge of specific services relevant to their needs. In interviews, program personnel reported a need for additional training and challenges related to high turnover.
23 Analyzing and Interpreting Qualitative Data
24 The Qualitative Data Analysis Process
Organize: read and interpret data; develop initial coding themes
Reduce: create a codebook; apply codes to data; check reliability
Describe: create a visual display; communicate results
25 Organize: Read and Interpret Data Read the data to explore the range, depth and diversity of information collected Interpreting is the ability to think abstractly and draw out patterns in the data over multiple iterations (i.e., cycles) There are multiple ways to read the data: Literal reading focuses on the actual content as recorded, including grammar, structure and content Interpretive reading makes sense of participant statements Reflexive reading examines the evaluator's role in collecting the information
26 Organize: Develop Initial Coding Themes A code is simply a way of classifying data into meaningful, relevant categories Take notes to form the foundation for your analysis, including: Thoughts about the underlying meaning of a participant's statements Hypotheses that might explain a puzzling observation Mental notes to pursue an issue further Keep the purpose of your codes in mind: codes should always be guided by your evaluation questions
27 Reduce: Create a Codebook A codebook maps the relationship between the raw data (e.g., text), themes and key questions guiding your evaluation. Codebooks should include code names or labels, definitions, and inclusion and exclusion criteria that: Clearly describe when to apply a code Clarify when NOT to use a code Distinguish between codes Provide examples of correct application of a code
28 Example Codebook Entry
Code name: Stigma. Definition: Service member descriptions of negative perceptions related to psychological health treatment within the Military Health System. Inclusion: Apply to all instances of seeking help for psychological health issues. Exclusion: Do not apply to non-psychological health concerns or the civilian health system. Example text: "I'm afraid I might lose my security clearance if I seek help for my nightmares."
Code name: Positive experiences. Definition: Service member descriptions of their prior positive experiences with health care providers. Inclusion: Apply to favorable experiences specific to health care. Exclusion: Do not apply to experiences that are negative or not specific to health care. Example text: "I know my doc is gonna take good care of me."
Code name: Negative experiences. Definition: Service member descriptions of their prior negative experiences with health care providers. Inclusion: Apply to unfavorable experiences specific to health care. Exclusion: Do not apply to experiences that are positive or not specific to health care. Example text: "I trusted my doc and then he ratted me out to my supervisor."
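For teams that keep their codebook outside dedicated QDA software, the same information can be stored in a small, machine-readable form. Below is a minimal sketch, assuming a Python-based workflow; the field names mirror the codebook columns above and the entries shown are illustrative, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class Code:
    """One codebook entry: label, definition, and decision rules for coders."""
    name: str
    definition: str
    inclusion: str                                  # when to apply the code
    exclusion: str                                  # when NOT to apply the code
    examples: list = field(default_factory=list)    # verbatim text illustrating correct use

codebook = [
    Code(
        name="Stigma",
        definition="Negative perceptions related to psychological health treatment in the MHS",
        inclusion="Apply to all instances of seeking help for psychological health issues",
        exclusion="Do not apply to non-psychological health concerns or the civilian health system",
        examples=["I'm afraid I might lose my security clearance if I seek help for my nightmares."],
    ),
    Code(
        name="Positive experiences",
        definition="Prior positive experiences with health care providers",
        inclusion="Apply to favorable experiences specific to health care",
        exclusion="Do not apply to negative or non-health-care experiences",
        examples=["I know my doc is gonna take good care of me."],
    ),
]
```

Keeping the inclusion and exclusion rules next to each code label makes it easy to print a one-page reference sheet for every coder on the team.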
29 Reduce: Apply Codes to Data Apply codes to data by reading and re-reading data until no new themes emerge Codes may be refined, expanded or eliminated throughout this process Clarify contrasts and comparisons New patterns may emerge in the data, even at the latest stage Establish credibility by linking data to codes Document quotes from multiple participants that support the evaluator's interpretations
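To make the "linking data to codes" step concrete, the sketch below shows one simple way to record coded segments and collect the supporting quotes for each code. It is an illustration under the assumption that coding decisions are stored as plain Python records; the participant IDs, quotes and code names are hypothetical.

```python
from collections import defaultdict

# Each coded segment: (participant ID, verbatim text, list of code names applied)
coded_segments = [
    ("P01", "I'm afraid I might lose my security clearance if I seek help.", ["Stigma"]),
    ("P02", "I trusted my doc and then he ratted me out to my supervisor.", ["Negative experiences", "Stigma"]),
    ("P03", "I know my doc is gonna take good care of me.", ["Positive experiences"]),
]

# Link each code back to its supporting quotes so findings can cite the raw data
quotes_by_code = defaultdict(list)
for participant, text, codes in coded_segments:
    for code in codes:
        quotes_by_code[code].append((participant, text))

for code, quotes in quotes_by_code.items():
    participants = {p for p, _ in quotes}
    print(f"{code}: {len(quotes)} supporting quote(s) from {len(participants)} participant(s)")
```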
30 Reduce: Check Reliability Inter-coder agreement is the extent to which independent coders evaluate data (e.g., blocks of text) and reach the same conclusion Ideally, two or more people code the data and compare how they applied codes: Do two coders working separately agree on the definitions? Do they apply the codes in the same way? 30
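Percent agreement, and its chance-corrected counterpart Cohen's kappa, are two common ways to summarize inter-coder agreement numerically. The sketch below is a minimal illustration that assumes two coders each made a yes/no decision about whether a single code applies to each segment; the decision lists are made-up example data.

```python
def percent_agreement(coder_a, coder_b):
    """Proportion of segments where both coders made the same decision."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement for two coders making binary (1 = code applies) decisions."""
    n = len(coder_a)
    observed = percent_agreement(coder_a, coder_b)
    # Expected agreement if each coder applied the code at their own base rate, independently
    p_a, p_b = sum(coder_a) / n, sum(coder_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Two coders' decisions on whether "Stigma" applies to 10 segments (illustrative data)
coder_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
coder_b = [1, 1, 0, 0, 0, 0, 1, 1, 0, 1]
print(percent_agreement(coder_a, coder_b))        # 0.9
print(round(cohens_kappa(coder_a, coder_b), 2))   # 0.8
```

Low agreement is a signal to revisit the code definitions and inclusion/exclusion criteria, not simply to re-score the data.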
31 Describe: Create a Visual Display When appropriate, use diagrams to show how something works or to clarify relationships between parts of a whole. Example: A network diagram shows links between categories, variables or events over time. [Slide diagram: a network of categories including Stigma, Direct Experience, Others' Experience, Negative Experiences, Media Depictions, Leadership, Unit Members, Family, Career Concerns and Care-Seeking Behavior.]
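Dedicated QDA packages can generate network displays automatically, but a basic version can also be drawn with general-purpose tools. The sketch below assumes the third-party networkx and matplotlib libraries are installed; the nodes echo the categories named on the slide, and the specific links between them are illustrative, not taken from the presentation.

```python
import networkx as nx
import matplotlib.pyplot as plt

# Build a graph whose edges represent links the evaluator has identified between categories
G = nx.Graph()
G.add_edges_from([
    ("Stigma", "Care-seeking behavior"),
    ("Stigma", "Career concerns"),
    ("Stigma", "Direct experience"),
    ("Stigma", "Others' experience"),
    ("Stigma", "Media depictions"),
    ("Others' experience", "Unit members"),
    ("Others' experience", "Family"),
    ("Direct experience", "Leadership"),
])

# Draw the diagram and save it for inclusion in an evaluation report
pos = nx.spring_layout(G, seed=42)  # fixed seed for a reproducible layout
nx.draw(G, pos, with_labels=True, node_color="lightsteelblue", font_size=8)
plt.savefig("stigma_network.png", dpi=200, bbox_inches="tight")
```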
32 Describe: Communicate Results Keep in mind your audience may be unfamiliar with qualitative methods Clearly describe: Participant recruitment Participant characteristics Data collection and analysis procedures Reasoning behind conclusions Be mindful of confidentiality as the sources of qualitative data are often easier to identify than numerical data 32
33 Reporting Qualitative Findings
34 Purpose of Qualitative Data Analysis and Reporting The purposes of conducting analyses in program evaluation are to address questions about the quality or effectiveness of a program and to identify ways to enhance a program Evaluators report results of evaluations to: Demonstrate the importance and benefits of the program Provide accountability to stakeholders (e.g., funding sources, oversight agencies, advocacy groups) Generate additional support for the program Inform stakeholders about plans to improve quality and outcomes 34
35 Establishing Validity in Qualitative Methods
Credibility: Extent to which data fit the views of the participants or whether the findings hold true. Evaluation tactic: member check (verify interpretations with participants).
Transferability: Extent to which findings are applicable to other populations and settings. Evaluation tactic: provide detailed descriptions about participants as well as the setting for the program evaluation.
Dependability: Extent to which data collection and analysis processes are logical and repeatable. Evaluation tactic: maintain or document detailed accounts of the program evaluation process.
Confirmability: Extent to which data support the findings. Evaluation tactic: use multiple evaluators and examine potential biases.
36 Managing Threats to Validity It is especially important when using qualitative methods to use practices that maximize data validity: Select participants representative of population Focus on common themes over infrequent responses Express data in participants words Document data collection and analysis procedures Check interpretations with participants If possible, compare interpretations across multiple evaluators 36
37 Reporting Qualitative Data When reporting the results of evaluations, present findings from qualitative and quantitative data together: Use graphs, tables, diagrams and key quotes Interpret findings and draw conclusions Support all conclusions with evidence through clear, consistent use of data (e.g., numbers, quotes) Identify implications for policy and practice 37
38 Example: Integrating Data from Mixed Methods Program Sierra administrators asked: Was the program implemented with fidelity (e.g., as intended or planned)? Quantitative data were derived from budgetary data and personnel records for the past five years:
Program budget: $733,696 (FY 2010), $707,472 (FY 2011), $685,446 (FY 2012), $645,798 (FY 2013), $598,442 (FY 2014), an 18% decrease
Personnel counts and percentage changes were reported for professional social workers, professional nurses, paraprofessionals and administrative staff over the same fiscal years.
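The reported 18% budget change follows from the first and last fiscal years shown: the drop from $733,696 to $598,442 works out to roughly 18.4 percent. A one-line check, using only the figures from the slide:

```python
budget_fy2010, budget_fy2014 = 733_696, 598_442
change = (budget_fy2014 - budget_fy2010) / budget_fy2010
print(f"{change:.1%}")  # -18.4%, reported on the slide as an 18% decrease
```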
39 Example: Integrating Data from Mixed Methods (continued) Qualitative data were collected using interviews with the current and previous program manager, and focus groups with service providers at two program sites. Sample themes identified:
Prior experience: Descriptions of ways in which prior experience and/or training related to quality of service provision. Example quote: "I had little formal training before coming to Program Sierra, so I had to figure it out as I went." - Provider
Training needs: Descriptions of ways in which increased training, especially for personnel with less prior knowledge, would enhance quality. Example quote: "We have a wide range of skillsets and training backgrounds, so it would help to provide more training and establish mentoring relationships." - Manager
Provider-client relationships: Descriptions of ways in which providers were able to form a working alliance with clients, their family members, and/or other providers involved in their care. Example quote: "I have great relationships with my clients but need to learn how to work the service systems to better help them." - Provider
40 Example: Integrating Data from Mixed Methods (continued) Summary of key findings: Budget reductions led to replacement of professional providers (i.e., social workers, nurses) by paraprofessionals (i.e., unlicensed personnel with some specialized training) Limited training and experience have reduced service quality and the ability of providers to navigate local service systems Suggested improvements: Program Sierra should provide quarterly training workshops for junior personnel and paraprofessionals, followed by ongoing support and quality assurance Experienced personnel should be considered for mentoring and/or supervisory roles 40
41 Special Considerations for Handling and Reporting Qualitative Data Qualitative data collection, analysis and reporting pose extra risks, because data can be more easily identified Limit access to primary data during analysis to only those who absolutely need access Use pseudonyms in place of names and only general demographic information (e.g., a Naval officer vs. a 32-year-old Navy Lieutenant stationed in Norfolk, Virginia) Follow all applicable rules, regulations and agreements regarding data disclosure
42 Common Challenges
43 Common Challenges FAQ My staff lack the resources, such as time, training and materials to collect and analyze qualitative data. What types of qualitative data analysis software are available to support program evaluation? How can I use qualitative methods to improve my program? 43
44 My Staff Lack the Resources, Such as Time, Training and Materials, to Collect and Analyze Qualitative Data Program evaluation, whether carried out through qualitative and/or quantitative methods, is an important investment in a program's future Qualitative methods do not need to be overly complex or time-consuming to be of benefit (e.g., comment cards, annual focus groups, observation) Over time, program evaluation results may be used to identify critical processes and eliminate or streamline others Many materials and training opportunities are free or low-cost, and consultation may be readily available from colleagues or researchers
45 What Types of Qualitative Data Analysis Software Are Available to Support Program Evaluation? Word processing and spreadsheet software (e.g., Microsoft Word and Excel) can be used for basic analysis functions, such as key word searches, sorting, filtering and creating simple graphics Free (e.g., CDC EZ-Text) or commercial (e.g., Atlas.ti or NVivo ) software includes advanced functions to identify and link common themes, record audio memos and create network diagrams Software choices depend upon the requirements of the data plan, complexity of analyses and training/support resources needed (Rogers & Goodrick, 2010) 45
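As a concrete example of the kind of basic analysis that general-purpose software supports, the sketch below counts how often each code was applied and filters the quotes for a single code using only the Python standard library. The file name, column names and code label are assumptions for illustration, not part of the presentation.

```python
import csv
from collections import Counter

# coded_data.csv columns: participant, code, quote (one coded segment per row)
with open("coded_data.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Frequency of each code, sorted from most to least common
code_counts = Counter(row["code"] for row in rows)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")

# Filter: all quotes tagged with one code of interest
stigma_quotes = [row["quote"] for row in rows if row["code"] == "Stigma"]
print(f"\n{len(stigma_quotes)} quotes coded 'Stigma':")
for quote in stigma_quotes:
    print(f"  - {quote}")
```

The same counts and filters could be produced with spreadsheet sorting and filtering; the value of scripting them is that the analysis can be rerun unchanged as new data arrive.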
46 How Can I Use Qualitative Methods to Improve My Program? Qualitative methods may be used to: Understand participants experiences with staff and services Improve fit between program and its context or population Overcome barriers to participation, low satisfaction, poor results Gather feedback from staff and stakeholders about potential improvements Real-life examples in military programs: Provision of services by unit-embedded providers to enhance program participation Modifications of program language to reduce stigma (e.g., customer vs. patient) Identification of unintended barriers to future help-seeking resulting from programs focused on improving resilience 46
47 Conclusion
48 Key Takeaways Qualitative data can provide a rich source of information about how a program operates and how it affects participants Qualitative data analysis involves identification of common themes used to address evaluation questions Qualitative and quantitative methods are complementary in that they both have unique strengths Photo by: Stewart Leiwakabessy 48
49 References and Resources
50 References and Resources
Administration for Children and Families, Office of Planning, Research and Evaluation (2010). The program manager's guide to evaluation (2nd ed.). Retrieved from the U.S. Department of Health and Human Services website.
Centers for Disease Control and Prevention (2011). Introduction to program evaluation for public health programs: A self-study guide.
Bamberger, M. (2012). Introduction to mixed methods in impact evaluation. The Rockefeller Foundation.
Creswell, J. W., Klassen, A. C., Plano Clark, V. L., & Clegg Smith, K. (2010). Best practices for mixed methods research in the health sciences. Office of Behavioral and Social Sciences Research, National Institutes of Health.
Krueger, R. A. (2002). Designing and conducting focus group interviews.
Krueger, R. A., & Casey, M. A. (2010). Focus group interviewing. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed.). San Francisco: Jossey-Bass.
Patton, M. Q. (2014). Qualitative research and evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: SAGE.
Rogers, P. J., & Goodrick, D. (2010). Qualitative data analysis. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed.). San Francisco: Jossey-Bass.
United States Agency for International Development (2013). Technical note: Conducting mixed-method evaluations.
51 References and Resources (continued) Agency for Healthcare Research and Quality: American Evaluation Association: Centers for Disease Control and Prevention, Program Performance and Evaluation Office: Center for Quality Assessment and Improvement in Mental Health: DCoE Program Evaluation: Defense and Veterans Brain Injury Center: Deployment Health Clinical Center: Médecins Sans Frontières (Doctors Without Borders): Minnesota Department of Health, Quality Improvement Toolbox: National Center for PTSD: National Institutes of Health Toolbox: National Quality Forum: Qual Page, University of Georgia:
52 References and Resources (continued) Substance Abuse and Mental Health Services Administration, National Behavioral Health Quality Framework: University of Kansas, Community Toolbox: University of Wisconsin-Extension: U.S. Army Public Health Command, Behavioral and Social Health Outcomes Program (BSHOP): Services.aspx U.S. Department of Veterans Affairs, Health Services Research & Development: 52
53 Feedback and Question-and-Answer Session
54 Feedback and Question-and-Answer Session We are now open for a live question-and-answer session. Please submit your questions anonymously via the Question box located in the center of your screen. Your feedback is important! After the Q&A, please follow the displayed link to complete the Interactive Customer Evaluation (ICE) card Or, you may immediately access the ICE card via the Chat box Additional questions and comments may be directed to Capt. Armen Thoumaian [email protected] 54
55 Save the Date The next webinar in the DCoE PEI Webinar Series will be on May 19, 2015, from 1-2:00 p.m. ET: Analyzing Program Evaluation Data: How to Interpret Quantitative Data
56 Program Sierra Example See also: Episodes 2 (Jan. 20, 2015) and 3 (Feb. 17, 2015) in the FY2015 DCoE PEI Series and Module 2 of the DCoE Program Evaluation Guide (2nd Ed.)
57 Non-Clinical Program Example Mission: At Program Sierra, we seek to ensure that service members who are wounded, ill or injured successfully reintegrate into civilian life or return to duty in the military. By performing our mission effectively, we hope to enhance force readiness and improve the quality and efficiency of services across the Defense Department DoD photo by Pat Cubal 57
58 Non-Clinical Program Example (continued) Goal 1: Program Sierra helps service members transition to civilian life or return to duty with increased functioning and a sustainable, individualized system of support and care to meet ongoing needs Objective 1A: To assess all service members referred to the program and work with the service member and his or her family or caregiver to determine their needs and develop a plan for reintegration, followed by guidance sessions and service referrals Objective 1B: To increase use of services and supports for participating service members and enhanced functioning in targeted areas measured on an ongoing basis Objective 1C: To ensure continuous access to medical and nonmedical services from point of illness/injury and for as long as needed to secure resilience and stability 58
59 Non-Clinical Program Example (continued) Goal 2: Program Sierra provides media materials and outreach in order to enhance service members knowledge and awareness of the support and services available to assist them with reintegration Objective 2A: To produce and deliver media materials to targeted locations in order to increase awareness of services and supports as indicated by reports from other programs regarding source of referral or knowledge Objective 2B: To increase service use and improve quality by promoting effective support and care services to those who need them 59
60 Non-Clinical Program Example (continued) Logic model:
INPUTS: Target population (seriously wounded, ill or injured service members and their families); staff (21, including non-medical care managers, recovery care managers and military Division Chief); stakeholders (Service Branch Leadership, Secretary of Defense, Congress); funding over the past five fiscal years ($5.5M, $1.5M, $1.2M, $1.2M, $800K)
ACTIVITIES: Care coordination (administer an assessment checklist to determine needs within the 7-phase continuum of care; complete comprehensive recovery plans and quarterly progress updates; provide consultations and educational material); outreach (develop content for articles, news bulletins, Facebook and the website; conduct outreach activities)
OUTPUTS: Guidance sessions completed (benefits/entitlements, financial, employment, Integrated Disability Evaluation System); referrals of participants, family members and caregivers to local resources and other DoD programs; information delivered; access to outreach materials (e.g., downloads, hits); reports of the program as a source of information by select other programs
OUTCOMES: Short-term (improved attitudes and confidence; increased use of medical and non-medical services and supports throughout recovery and rehabilitation; increased knowledge of benefits, entitlements, resources and transition services); medium-term (improved quality of life and stability; reduced delays and gaps in treatment (medical) and support services (non-medical); increased resilience and retention; successful reintegration into military or civilian life); long-term (increased force readiness; improved service continuity; improved service quality and reduced costs)
61 Non-Clinical Program Example (continued)
ASSUMPTIONS: Care coordination is required for the target population to effectively access available services and supports.
EXTERNAL FACTORS: The program is highly political; care for wounded service members is a priority issue for the President, Congress and senior leaders in the Defense Department and Department of Veterans Affairs. There is widespread community support for assisting wounded, ill and injured service members.
An additional example for a clinical program is provided in DCoE's Program Evaluation Guide (2nd Edition), Appendix A.
