Collecting, Analyzing and Interpreting Qualitative Data


Collecting, Analyzing and Interpreting Qualitative Data

CAPT Armen H. Thoumaian, Ph.D., USPHS
Aaron Sawyer, Ph.D.
Patrick High, Dr.P.H.
Destiny Simone Ramjohn, Ph.D.
Debra Stark, M.B.A.

July 15, 2014

Webinar Details

- This webinar presentation has been pre-recorded
- A live question-and-answer session will be held at the conclusion of the presentation
- Questions may be submitted via the Question pod
- Audio for this presentation will be provided through Adobe Connect; there is no separate dial-in
- Closed captioning is not available for this event

Continuing Education Details

- Continuing education credit is not available for this event
- Sources for materials and additional training information:
  - Materials from this series are available at: dcoe.mil/about_dcoe/program_evaluation.aspx
  - For information on other DCoE webinar and training series, visit: dcoe.mil/training/monthly_webinars.aspx
- Materials for this webinar are available in the Files box

Presenters

CAPT Armen Thoumaian, Ph.D.
U.S. Public Health Service
Acting Deputy Chief of Integration, Office of Policy, Programs and Integration, DCoE

CAPT Armen Thoumaian is a scientist director in the Commissioned Corps of the U.S. Public Health Service with more than 30 years' experience in health and mental health program design and evaluation. In January 2012, CAPT Thoumaian joined the staff at the Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury (DCoE) to help design and implement program evaluation and improvement efforts in the Defense Department. He holds a B.A. in Psychology and Sociology, an M.A. in General Experimental Psychology, and a Ph.D. in Social Welfare and Social Work, and completed a National Institute of Mental Health fellowship in Community Mental Health.

Presenters

Destiny Simone Ramjohn, Ph.D.
Research Scientist, Contract Support for DCoE

Dr. Destiny Simone Ramjohn is a medical sociologist and qualitative researcher with over seven years of program evaluation experience. She has worked extensively to examine the contextual and situational factors associated with health behaviors, as well as methodological challenges in intervention research, in domestic and international communities. Dr. Ramjohn has served as a consultant with the Defense Department, Centers for Disease Control and Prevention, New York State Health Foundation and The Commonwealth Fund. She has taught numerous courses on employing qualitative methods within public health evaluation, education and practice. Dr. Ramjohn earned her doctorate in Sociomedical Sciences from Columbia University.

Debra Stark, M.B.A.
Research Scientist, Contract Support for DCoE

Ms. Debra Stark is a survey methodologist with 15-plus years of research experience. Her work includes program evaluation and monitoring, qualitative data analysis and survey instrument design. She has worked on public health services evaluation projects with various federal agencies, including the Department of Veterans Affairs and TRICARE Management Activity. Ms. Stark received an M.B.A. from Vanderbilt University.

Moderator and Presenter

Aaron Sawyer, Ph.D.
Research Scientist, Contract Support for DCoE

Dr. Aaron Sawyer is a clinical psychologist with extensive expertise in intervention outcome research and program evaluation. He has delivered child, family and adult interventions for more than a decade, including specialization in trauma and experience working with military families. Dr. Sawyer holds an M.S. in Experimental Psychology and a Ph.D. in Clinical Psychology. He completed post-doctoral training at the Kennedy Krieger Institute/Johns Hopkins University and is a licensed psychologist.

Patrick High, Dr.P.H.
Epidemiologist, Contract Support for DCoE

Dr. Patrick High is an epidemiologist with over a decade of experience and expertise in survey design, research methodology and program evaluation. His experience includes supporting the Office of the Undersecretary of Defense for Personnel and Readiness, Operations Research and Safety, and the Defense Suicide Prevention Office as an epidemiologist. Dr. High holds a doctorate in public health, with specialization in Epidemiology and Biostatistics, from the Uniformed Services University of the Health Sciences. He previously spent nine years in the Illinois Army National Guard.

Overview and Objectives

This training presentation will provide an introduction to collecting, coding, analyzing and understanding qualitative data. Qualitative data include information derived from interviews, observation, focus groups and written feedback or comments.

At the conclusion of this webinar, participants will be able to:
- Explain how qualitative data can be used in support of program evaluation and improvement efforts
- Demonstrate knowledge of important considerations for collecting and coding qualitative data
- Implement suggested guidance to begin analyzing qualitative data and integrating them with quantitative data
- Identify common challenges that programs face when using qualitative data, and identify resources for technical support

Agenda

- Introduction to Qualitative Methods
- Collecting Qualitative Data
- Analyzing and Interpreting Qualitative Data
- Reporting Qualitative Data
- Common Challenges
- Conclusion
- Resources and References
- Feedback and Q&A Session

Introduction to Qualitative Methods

What Counts in Program Evaluation?

"Not everything that can be counted counts, and not everything that counts can be counted." - William Bruce Cameron

What Are Qualitative Methods?

Qualitative methods are forms of data collection and analysis based on textual or non-numerical information.

You are likely already engaged in qualitative methods:
- Logic model development
- Notes about program participants
- Meeting minutes
- Staff feedback

Advantages of Qualitative Methods

Qualitative methods have distinct advantages for:
- Understanding meaning
- Understanding context
- Understanding process
- Identifying unknown or unanticipated phenomena
- Developing causal explanations
(Patton, 2001)

Example: How Can Qualitative Methods Add to My Program Evaluation?

A new program manager wants to find out how to improve a program with declining participation rates and poor satisfaction ratings from participants (78% report they are "dissatisfied" or "very dissatisfied").

Qualitative methods can be used to explore:
- Why are program participants dissatisfied?
- What are the barriers to program participation?
- How do participants think the program's services can be improved?

Key Differences Between Qualitative and Quantitative Methods

Qualitative | Quantitative
Data are more specific to participants | Data can often be applied to a broader population
Answers: Why? How? | Answers: How many? Who? When? Where?
Data are generally text-based | Data are generally number-based
Data collection and analysis are generally time-intensive | Data collection and analysis are generally efficient
Data collection tools are flexible | Data collection tools are fixed

Establishing Validity in Qualitative Methods

Term | Definition | Evaluation Tactic
Credibility | Extent to which data fit views of the participants, or whether the findings hold true | Check interpretations with participants
Transferability | Extent to which findings are applicable to other populations and settings | Provide information about participants to readers
Dependability | Extent to which data collection and analysis processes are logical and repeatable | Document all processes and explain the basis for them
Confirmability | Extent to which data support the findings | Use multiple evaluators and examine potential biases

No Single Method Is Superior

- Just as no single treatment/program design can solve complex social problems, no single evaluation method can fully explain a program
- Qualitative evaluation methods provide a more complete picture than quantitative methods alone, especially with regard to program processes and participant experiences
- Qualitative methods help us understand the richness and complexity of psychological health and traumatic brain injury programs

Mixed Methods Can Produce More Complete Findings

Mixed methods combine the benefits of both qualitative and quantitative methods and can:
- Assess size and frequency and explore meaning and understanding
- Answer multiple evaluation questions using tailored methods (e.g., focus groups and statistical analyses)

Mixed Methods Can Produce More Complete Findings (continued)

Example Question: Does the program meet the needs of its target population?

Quantitative results:
- Participation rates show that 85% of the identified population is participating
- In a survey, only 35% of participants reported that their needs were met

Qualitative results:
- Focus group participants report specific areas of unmet needs, as well as needs the program had not considered

Sample Mixed Method Designs

- Parallel: Designs in which quantitative and qualitative data are gathered at the same time and results are merged
- Sequential: Designs in which one data set builds on another
- Embedded/Nested: One design incorporates aspects of the other design (e.g., a rating form that includes open-ended questions)

Collecting Qualitative Data

Choose Data Collection Methods Based on Evaluation Questions

- Select qualitative data collection methods based on the specific question(s) to be addressed
- Evaluations often contain multiple questions of interest (e.g., satisfaction, barriers, quality of implementation)
- Be sure to keep evaluation questions simple and focused
- Note that data collection may lead to additional evaluation questions as new information is gained

Ethics and Confidentiality

Inform participants of the potential risks of participation. Clearly state to participants:
- How data will be collected (e.g., recordings, written)
- Who will participate (e.g., rank, job roles)
- Who will have access to data
- How confidentiality will be maintained
- How data will be stored/destroyed

Qualitative Data Collection Methods

Title | Description | Characteristics
Focus Groups | Group conversation facilitated by a moderator | Use structured protocol, groupings of similar individuals
Interviews | One-on-one conversation | Can be structured or semi-structured
Open-ended Comments | Free-text response on feedback forms or surveys | Voluntary expression
Observation | Log or description of activity | Applied in a consistent manner to minimize bias
After Action Reviews (AARs) | Group review following an activity | Focus on strengths and opportunities for improvement
Case Studies | In-depth observations over time | Study of one individual, process or program

Focus Groups

- Moderator
- 6-10 participants
- Discussion guide
- Meeting room
- Recording method

Requires an atmosphere structured to encourage interactive discussion:
- Safe and permissive
- Preserves integrity and dignity

The discussion is the data.

Interviews

- Most appropriate when you need in-depth information from an individual very familiar with the topic
- Work well for sensitive or complex issues
- Similar to focus groups, interviews use a structured or semi-structured discussion guide
- Call for flexible, active guidance on the part of the interviewer, since there is no group dynamic

Open-Ended Comments

- Often found on feedback forms or open-ended survey questions
- The voluntary nature of the data makes them more compelling
- May not get information on the topic of interest
- May not get a range of opinions

Observation

- Best when there is a need to understand behavior in context (i.e., how people actually behave in realistic settings)
- Does not rely on attitudes and self-report
- Often uses a checklist to ensure uniform data collection
- May be difficult to collect quotes
- The act of observation can influence the behavior of those being observed

After Action Reviews

- Post-activity "hotwash": the group meets to discuss impressions of how the activity actually occurred in real time
- Summarize and discuss the most important points
- Generally focused on strengths and weaknesses
- Often time-limited/constrained
- Meeting notes serve as the basis for reports/analysis

Case Studies

- Examine what happens to a person or group over the course of time
- Allow for an in-depth, detailed account of the important experiences of a specific person
- Generally portray the story of someone who represents the population

Analyzing and Interpreting Qualitative Data

Digging for Riches in the Data

Qualitative evaluation findings provide rich information by:
- Illuminating the stories behind the numbers
- Producing in-depth information difficult to extract through quantitative methods
- Yielding explanations for unexpected findings from quantitative studies
- Suggesting additional questions for quantitative evaluation

The Qualitative Data Analysis Process

Organize:
- Read and interpret data
- Develop initial coding themes

Reduce:
- Create a codebook
- Apply codes to data
- Check reliability

Describe:
- Create a visual display
- Communicate results

Organize: Read and Interpret Data

Read the data to explore the range, depth and diversity of the information collected. Interpreting is the ability to think abstractly and see patterns in the data.

There are multiple ways to read the data:
- Literal reading focuses on the actual content as recorded, including grammar, structure and content
- Interpretive reading makes sense of participant statements
- Reflexive reading examines the evaluator's role in collecting the information

Organize: Develop Initial Coding Themes

A code is simply a way of classifying data into meaningful, relevant categories.

Take notes to form the foundation for your analysis, including:
- Mental notes to pursue an issue further
- Thoughts about what a participant was really saying
- Hypotheses that might explain a puzzling observation

Keep the purpose of your codes in mind: codes should always be guided by your evaluation questions!

Reduce: Create a Codebook

A codebook maps the relationship between the raw data, the themes and the key questions guiding your evaluation.

Codebooks should include code names or labels, definitions, and inclusion and exclusion criteria:
- Description of when to use a code
- Description of when NOT to use a code
- Examples of correct application of a code

Example Codebook Entry

Code Name | Code Definition | Inclusion | Exclusion | Example Text
Stigma | Service member descriptions of the stigma that exists in the Military Health System | Apply to all instances of seeking help for mental health | Do not apply to the civilian health system | "I'm afraid I might lose my security clearance if I seek help."
Positive experiences | Service member descriptions of their prior positive experiences with health care providers | Apply to favorable experiences specific to health care | Do not apply to negative experiences or non-health care | "I know my doc is gonna take good care of me."
Negative experiences | Service member descriptions of their prior negative experiences with health care providers | Apply to unfavorable experiences specific to health care | Do not apply to positive experiences or non-health care | "I trusted my doc and then he ratted me out to my CO [commanding officer]."

Reduce: Apply Codes to Data

- Apply codes to data by reading and re-reading the data until no new themes emerge
- Codes may be refined, expanded or eliminated throughout this process
  - Clarify contrasts and comparisons
  - New patterns may emerge in the data, even at the latest stage
- Establish credibility by linking data to codes
  - Document quotes from multiple participants that support the evaluator's interpretations
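To make the coding step concrete, here is a minimal Python sketch (not part of the original webinar) of how a codebook like the example entry above could be represented and applied. The code names and keywords are hypothetical, and simple keyword matching stands in for the judgment a human coder would apply.

# A minimal sketch, assuming Python 3.9+; codebook entries and keywords
# are hypothetical, and keyword matching substitutes for human judgment.

CODEBOOK = {
    "stigma": {
        "definition": "Stigma around seeking mental health care",
        "keywords": ["stigma", "security clearance", "what people think"],
    },
    "positive_experience": {
        "definition": "Prior positive experiences with providers",
        "keywords": ["good care", "trust my doc", "helped me"],
    },
    "negative_experience": {
        "definition": "Prior negative experiences with providers",
        "keywords": ["ratted me out", "dismissed", "ignored"],
    },
}

def apply_codes(segment: str) -> list[str]:
    """Return every code whose keywords appear in a text segment."""
    text = segment.lower()
    return [code for code, entry in CODEBOOK.items()
            if any(keyword in text for keyword in entry["keywords"])]

print(apply_codes("I'm afraid I might lose my security clearance if I seek help."))
# -> ['stigma']

In a real evaluation the codebook would also carry the inclusion/exclusion criteria shown above, and coders would tag segments by hand; a structure like this mainly helps keep code labels and definitions consistent across coders.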

Reduce: Check Reliability

Inter-coder agreement is the extent to which independent coders evaluate data (e.g., blocks of text) and reach the same conclusion.

Ideally, two or more people code the data and compare how they applied the codes:
- Do two coders working separately agree on the definitions?
- Do they apply the codes in the same way?
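One common statistic for inter-coder agreement is Cohen's kappa, which corrects raw percent agreement for the agreement two coders would reach by chance. The sketch below is a minimal, self-contained illustration (not from the webinar); the coded segments are invented for the example.

# A minimal sketch of Cohen's kappa for two coders who each assigned
# one code to the same set of text segments.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Observed agreement corrected for agreement expected by chance."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by two independent coders to ten segments
coder_1 = ["stigma", "positive", "stigma", "negative", "stigma",
           "positive", "negative", "stigma", "positive", "stigma"]
coder_2 = ["stigma", "positive", "negative", "negative", "stigma",
           "positive", "negative", "stigma", "stigma", "stigma"]

print(f"kappa = {cohens_kappa(coder_1, coder_2):.2f}")  # kappa = 0.68

Values near 1 indicate strong agreement; the agreement threshold an evaluation will accept should be decided before coding begins, and disagreements used to refine the codebook.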

Describe: Create a Visual Display

When appropriate, use diagrams to show how something works or to clarify relationships between parts of a whole. For example, a network diagram shows links between categories, variables or events over time.

[Example network diagram: care-seeking linked to categories such as Unit Members, Leadership, Family, Media Depictions, Direct Experience, Others' Experience, Stigma, Negative Experiences and Career Concerns]
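A diagram like the one described can be generated with standard graph libraries. The sketch below assumes the networkx and matplotlib packages are installed; the nodes and links are illustrative, loosely following the slide's example categories.

# A minimal sketch of a category network diagram; edges are illustrative.
import matplotlib.pyplot as plt
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("Direct Experience", "Stigma"),
    ("Others' Experience", "Stigma"),
    ("Media Depictions", "Stigma"),
    ("Stigma", "Career Concerns"),
    ("Career Concerns", "Care-seeking"),
    ("Leadership", "Care-seeking"),
    ("Family", "Care-seeking"),
])

plt.figure(figsize=(8, 5))
nx.draw(G, with_labels=True, node_color="lightsteelblue",
        node_size=2500, font_size=8)
plt.savefig("careseeking_network.png", dpi=150, bbox_inches="tight")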

Describe: Communicate Results

Keep in mind that your audience may be unfamiliar with qualitative methods. Clearly describe:
- Participant recruitment
- Participant characteristics
- Data collection and analysis procedures
- The reasoning behind conclusions

Be mindful of confidentiality, as the sources of qualitative data are often easier to identify than the sources of numerical data.

Reporting Qualitative Data

Purpose of Reporting

- Demonstrate the importance and benefits of the program
- Provide accountability to funding sources and other stakeholders
- Generate additional support and buy-in for the program
- Inform stakeholders about plans to improve quality and outcomes

Reporting Qualitative Data

When presenting the results of evaluations, remember that qualitative and quantitative data and findings are typically reported together:
- Present data using graphs, tables, diagrams and key quotes
- Interpret findings and draw conclusions
- Support all conclusions with evidence through clear, consistent use of data (e.g., numbers, quotes)
- Identify implications for policy and practice

Budget Example: Integrating Quantitative and Qualitative Information

Category | Budgeted | Awarded/Funded | Spent
Staff | $733,696 | $733,696 | $707,472
Supplies | $0 | $0 | $0

An interview with the program administrator revealed:
- The program spent less money on staff because a staff member left and was replaced by a more junior individual at a lower cost
- The program does not budget for supplies because it competes for end-of-year funds (e.g., surplus money) to provide supplies

Program Intent Example: Integrating Quantitative and Qualitative Information

Evaluation Question: Are staff and stakeholders aware of the program's mission, goals and objectives?

The program manager gave a "Yes" response on the checklist. Qualitative methods revealed additional information:
- Interviews conducted with stakeholders indicated that not all stakeholders were aware of the program's mission and goals
- A focus group conducted with program staff indicated that not all staff were aware of the program's specific goals

Sample Format for Written Report

- Executive Summary
- Program Overview
  - Mission, goals and objectives
  - Inputs and activities
  - Outputs and outcomes
- Program Evaluation Methods
- Results and Conclusions
- References
- Appendices

Common Challenges

Special Considerations for Conducting Analyses of Military Programs

- Service members may not be forthcoming in groups that include individuals with higher rank
- Special care must be taken to protect participant confidentiality, given common concerns related to career trajectories and discrimination
- It may be difficult to secure volunteers to participate in qualitative data collection activities due to time constraints
- Stakeholders are likely to be unfamiliar with qualitative methods and to have biases against their use (e.g., that these methods do not offer added value)

Threats to Validity

It is especially important when using qualitative methods to use practices that maximize data validity:
- Select participants representative of the population
- Focus on common themes over infrequent responses
- Express data in participants' words
- Document data collection and analysis procedures
- Check interpretations with participants
- If possible, compare interpretations across multiple evaluators

Common Challenges FAQ

- "My staff lack the resources, such as time, training and materials, to collect and analyze qualitative data."
- "How do I handle intense emotional responses that may occur during qualitative data collection?"
- "How can I use qualitative methods to improve my program?"

My Staff Lack the Resources, Such as Time, Training and Materials, to Collect and Analyze Qualitative Data

- Program evaluation, whether carried out through qualitative and/or quantitative methods, is an important investment in a program's future
- Qualitative methods do not need to be overly complex or time-consuming to be of benefit (e.g., comment cards, annual focus groups, observation)
- Over time, program evaluation results may be used to identify critical processes and eliminate or streamline others
- Many materials and training opportunities are free or low-cost, and consultation may be readily available from colleagues or researchers

How Do I Handle Intense Emotional Responses That May Occur During Qualitative Data Collection?

Staff involved in the data collection process may encounter statements about:
- Medical, psychological health or traumatic brain injury issues
- Suicidal/homicidal thoughts, sadness, anger, intense frustration
- Child or spousal abuse, relationship problems

Develop standard operating procedures, based on applicable regulations, for how to handle concerning statements, including:
- If/when confidentiality may be broken
- How to proceed in an emergency situation
- When and to whom reports of concerns should be made
- Referral resources for participants and procedures for their distribution

How Can I Use Qualitative Methods to Improve My Program?

Qualitative methods may be used to:
- Understand participants' experiences with staff and services
- Improve the fit between a program and its context or population
- Overcome barriers to participation, low satisfaction or poor results
- Gather feedback from staff and stakeholders about potential improvements

Real-life examples in military programs:
- Provision of services by unit-embedded providers to enhance program participation
- Modification of program language to reduce stigma (e.g., "customer" vs. "patient")
- Identification of unintended barriers to future help-seeking resulting from programs focused on improving resilience

Conclusion

Key Takeaways

- Qualitative data can provide a rich source of information about how a program operates and how it affects participants
- Qualitative and quantitative data are complementary in that they both have unique strengths
- Qualitative data are especially useful in designing program improvements

Resources

- DCoE Program Evaluation Guide: http://www.dcoe.mil/content/navigation/documents/dcoe_program_evaluation_guide.pdf
- U.S. Army Public Health Command, Behavioral and Social Health Outcomes Program (BSHOP): http://phc.amedd.army.mil/topics/healthsurv/bhe/pages/behavioralandsocialhealthoutcomesprogram%28BSHOP%29Services.aspx
- Médecins Sans Frontières (Doctors Without Borders): http://fieldresearch.msf.org/msf/bitstream/10144/84230/1/qualitative%20research%20methodology.pdf
- Qual Page, University of Georgia: http://www.qualitativeresearch.uga.edu/qualpage/index.html
- University of Kentucky Extension: http://www2.ca.uky.edu/agpsd/focus.pdf
- Michigan Public Health Training Center: http://miphtcdev.web.itd.umich.edu/trainings/courses/community-based-participatory-research-partnership-approach-public-health-downloadable

Resources (continued)

- Centers for Disease Control and Prevention: http://www.cdc.gov/eval/index.htm
- National Network of Libraries of Medicine: http://nnlm.gov/evaluation/guides.html
- The Community Tool Box, University of Kansas: http://ctb.ku.edu/en
- Minnesota Department of Health: http://www.health.state.mn.us/divs/opi/qi/toolbox
- Deployment Health Clinical Center: http://www.pdhealth.mil/clinicians/assessment_tools.asp
- Defense and Veterans Brain Injury Center: http://dvbic.dcoe.mil/diagnosis-assessment?audience[0]=3
- National Center for Telehealth and Technology: http://www.t2.health.mil/

References

Bamberger, M. (2012). Introduction to mixed methods in impact evaluation. The Rockefeller Foundation. Retrieved June 30, 2014, from http://www.interaction.org/sites/default/files/mixed%20methods%20in%20impact%20evaluation%20%28English%29.pdf

Creswell, J.W., Klassen, A.C., Plano Clark, V.L., & Clegg Smith, K. (2010). Best practices for mixed methods research in the health sciences. Office of Behavioral and Social Sciences Research, National Institutes of Health. Retrieved June 30, 2014, from http://obssr.od.nih.gov/mixed_methods_research/

Driscoll, D.L., Appiah-Yeboah, A., Salib, P., & Rupert, D.J. (2007). Merging qualitative and quantitative data in mixed methods research: How to and why not. Ecological and Environmental Anthropology, 3, 19-28. Retrieved June 30, 2014, from http://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1012&context=icwdmeea

Patton, M.Q. (2001). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: SAGE.

United States Agency for International Development (2013). Technical note: Conducting mixed-method evaluations. Retrieved from http://www.usaid.gov/sites/default/files/documents/1870/Mixed_Methods_Evaluations_Technical_Note.pdf