Codebook Development for Team-Based Qualitative Analysis
Kathleen M. MacQueen, Centers for Disease Control and Prevention (CDC), Atlanta; Eleanor McLellan, TRW, Inc., Atlanta; Kelly Kay and Bobby Milstein, CDC, Atlanta

Cultural Anthropology Methods 10(2):31-36

One of the key elements in qualitative data analysis is the systematic coding of text (Strauss and Corbin 1990:57-60; Miles and Huberman 1994:56). Codes are the building blocks for theory or model building and the foundation on which the analyst's arguments rest. Implicitly or explicitly, they embody the assumptions underlying the analysis. Given the interdisciplinary nature of research at the Centers for Disease Control and Prevention (CDC), we have sought to develop explicit guidelines for all aspects of qualitative data analysis, including codebook development. On the one hand, we must often explain basic methods such as this in clear terms to a wide range of scientists who have little or no experience with qualitative research and who may be deeply skeptical of the validity of our results. On the other, our codebook development strategy must be responsive to the teamwork approach that typifies the projects we undertake at CDC, where coding is generally done by two or more persons who may be located at widely dispersed sites. We generally use multiple coders so that we can assess the reliability and validity of the coded data through intercoder agreement measures (e.g., Carey et al. 1996) and, for some projects, because multiple coders are the only reasonable way to handle the sheer volume of data generated. The standardized structure and dynamic process used in our codebook development strategy reflect these concerns. This paper describes (1) how a structured codebook provides a stable frame for the dynamic analysis of textual data; (2) how specific codebook features can improve intercoder agreement among multiple researchers; and (3) the value of team-based codebook development and coding.
Origins of the Codebook Format

Our codebook format evolved over the course of several years and a variety of projects. The conceptual origins took shape in 1993 during work on the CDC-funded Prevention of HIV in Women and Infants Project (WIDP) (Cotton et al. 1998), which generated approximately 600 transcribed semistructured interviews. One research question pursued was whether women's narratives about their own heterosexual behavior could help us understand general processes of change in condom use behavior (Milstein et al. 1998). The researchers decided to use the processes of change (POC) constructs from the Transtheoretical Model (Prochaska 1984; DiClemente and Prochaska 1985) as a framework for the text analysis. However, the validity of the POC constructs for condom-use behavior was unknown, and a credible and rigorous text coding strategy was needed to establish their applicability and relevance for this context. To do this, the analysts had to synthesize all that was known about each POC construct, define what it was and what it was not, and, most importantly, learn how to recognize one in natural language. Several years earlier, O'Connell (1989) had confronted a similar problem while examining POCs in transcripts of psychotherapy sessions. Recognizing that "coding processes of change often requires that the coder infer from the statement and its context what the intention of the speaker was," O'Connell (1989:106) developed a coding manual that included a section for each code titled "Differentiating (blank) from Other Processes." Milstein and colleagues used O'Connell's "differentiation" section in
a modified format in their analysis of condom behavior change narratives. They conceptualized the "differentiation" component as "exclusion criteria," which complemented the standard code definitions (which then became known as "inclusion criteria"). To facilitate on-line coding with the software program Tally (Bowyer 1991; Trotter 1993), components were added for the code mnemonic and a brief definition, as well as illustrative examples. Thus, the final version of the analysis codebook contained five parts: the code mnemonic, a brief definition, a full definition of inclusion criteria, a full definition of exclusion criteria to explain how the code differed from others, and example passages that illustrated how the code concept might appear in natural language. During the code application phase, information in each of these sections was supplemented and clarified (often with citations and detailed descriptions of earlier work), but the basic structure of the codebook guidelines remained stable. A similar codebook format was used in an analysis of brief open-ended responses to a question about perceived exposure to HIV, which was included in a standardized survey of 1,950 gay and bisexual men enrolled in the CDC Collaborative HIV Seroincidence Study (MacQueen et al. 1996). For this project we needed a codebook that would facilitate coder recognition of discrete events and specific behaviors as well as the ability to distinguish statements of fact (e.g., a partner who was known to be HIV-seropositive) from statements of perception (e.g., a partner who was thought to be uninfected). More recently, we have used the structured codebook format as a key element in data management and analysis for Project LinCS: Linking Communities and Scientists, a multisite study with over 300 transcribed in-depth interviews (MacQueen and Trotter 1997). Thus far, we have developed seven distinct codebooks to guide and structure our use of the extensive Project LinCS database.
In addition, our codebook format has been incorporated into in-house software applications for qualitative data management and analysis such as CDC EZ-Text (Carey et al. 1998).

Codebook Structure

The codebook structure has evolved to include six basic components: the code, a brief definition, a full definition, guidelines for when to use the code, guidelines for when not to use the code, and examples. Table 1 illustrates how the components were operationalized for a code we developed as part of an analysis of emic representations of "community" (Blanchard et al. 1997). Though the structure is simple and stable, the process of building the codebook is complex and dynamic. We have found that a relational database management program such as Access (Microsoft Corporation) [1] is conceptually well suited for both maintaining the structure and supporting the process. In a relational database the basic unit is the relation, which is generally conceptualized as a table with rows of "observations" (in our case, the codes) and columns of "attributes" (the definitional parameters for the codes). The rows and columns look like a standard quantitative data set, and the format provides all of the flexibility and automated processing typically associated with quantitative data management. However, unlike many database management systems that are specifically designed for quantitative data, the cells in an Access table or relation can contain unlimited amounts of text. This feature makes it possible to work with the codebook as a qualitative database. There are practical advantages to using a database management program for codebook development. First, it facilitates a systematic approach by providing a template that prompts for key components.
Second, it facilitates the production of multiple versions of the codebook: for example, a simple listing of the codes with brief definitions for quick reference, and a full version with all components for detailed reference and comparison of coding guidelines. Third, it is easy to revise the codebook and then print only the modified codes and their definitions (rather than the whole codebook). Fourth, use of a date field makes it easy to generate reports that list coding guidelines that were changed after a particular date. Fifth, the codebook can be output in a wide variety of formats, including ASCII text files, spreadsheets, and other database formats. Some of these formats may then be imported directly into qualitative analysis programs such as Tally, NUD*IST (Qualitative Solutions & Research), and ATLAS/ti (Scientific Software Development). Sixth, because the codebook can be detached from other parts of the analytic database, it can easily be modified, copied, and replaced.
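To make the table structure concrete, here is a minimal sketch of such a codebook relation. The authors used Microsoft Access; this illustration substitutes Python's built-in sqlite3 module, and the code name, definitions, and date shown are hypothetical:

```python
import sqlite3

# One row per code; the columns mirror the six codebook components,
# plus a date field for tracking revisions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE codebook (
        code            TEXT PRIMARY KEY,  -- the code mnemonic
        brief_def       TEXT,              -- brief definition
        full_def        TEXT,              -- full definition
        use_when        TEXT,              -- when to use the code
        do_not_use_when TEXT,              -- when not to use the code
        example         TEXT,              -- illustrative passage
        revised         TEXT               -- date of last revision (ISO format)
    )
""")
conn.execute(
    "INSERT INTO codebook VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("MARGIN",                               # hypothetical code
     "Marginalized community members",
     "Groups described as set apart from the community at large",
     "Use for explicit references to groups on the social periphery",
     "Do not use for general descriptions of community structure",
     "'...the homeless, the addicted, those the town forgot...'",
     "1998-03-15"))
conn.commit()

# A quick-reference version: codes with brief definitions only.
for code, brief in conn.execute("SELECT code, brief_def FROM codebook"):
    print(code, "-", brief)        # -> MARGIN - Marginalized community members

# Report only the coding guidelines changed after a particular date.
changed = conn.execute(
    "SELECT code FROM codebook WHERE revised > ?", ("1998-01-01",)).fetchall()
```

The same table can be exported to ASCII text or a spreadsheet with a single query, which is the practical basis for the multiple codebook versions described above.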
Last, but not least, a codebook expressed as a relation or table in a relational database can, in turn, be linked to other codebook tables. This allows the analyst to define directional links between sets of codes and add new dimensions to the relationships among codes. For example, hierarchical relationships can be defined such that a single code in one table is linked to a series of related codes in another table (a "one-to-many" relationship). Hierarchical relationships can be used to create families of codes (Muhr 1994) that can be aggregated, reviewed, and analyzed at increasingly general levels. Thus, the scale of analysis can be easily and systematically modified. Alternatively, a set of codes in one table may have multiple links to those in other tables, forming a "many-to-many" relationship. In this case, the code relationships can be summarized as a network. Both hierarchical and network linkages can be portrayed graphically.

In our approach, the codebook functions as a frame or boundary that the analyst constructs in order to systematically map the informational terrain of the text. The codebook may or may not reflect "natural" boundaries for the terrain, but it always reflects the analyst's implicit or explicit research questions. It forces the analyst to place his or her assumptions and biases in plain view. The codes, in turn, function like coordinates on the map frame; when applied to the text, they link features in the text (e.g., words, sentences, dialog) to the analyst's constructs. The adequacy of answers to research questions can then be assessed in terms of the sensitivity and specificity of the codes, the richness of the text, and the validity and reliability of the links established among them. From this perspective, the central challenge of systematic qualitative analysis lies within the coding process.
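Returning to the linked-table idea above, the one-to-many code-family relationship can be sketched in miniature. The code names and segment identifiers below are hypothetical, not drawn from our projects:

```python
# A one-to-many "code family" link: each general code maps to several
# specific codes, as in a linked pair of codebook tables.
families = {
    "ECONOMIC": ["INCOME", "EMPLOY", "HOMELESS"],
    "HEALTH":   ["HIVRISK", "DRUGUSE"],
}

# Invert the link so coded text can be aggregated to the general level.
parent = {child: top for top, children in families.items() for child in children}

# Segments tagged with specific codes...
segments = [("seg01", "INCOME"), ("seg02", "DRUGUSE"), ("seg03", "HOMELESS")]

# ...reviewed at the family level, systematically changing the scale of analysis.
by_family = {}
for seg_id, code in segments:
    by_family.setdefault(parent[code], []).append(seg_id)

print(by_family)   # {'ECONOMIC': ['seg01', 'seg03'], 'HEALTH': ['seg02']}
```

A many-to-many linkage would simply allow a specific code to appear under more than one general code, turning the aggregation into a network rather than a tree.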
The Coding Process

As Bernard (1994:193) points out, a code can be used as an encryption, indexing, or measurement device; we focus on using index codes to tag text for retrieval and measurement codes to assign values to text, such as the frequency, amount, or presence/absence of information (Bernard 1994; Seidel and Kelle 1995; Bernard and Ryan In press). In both cases, the code adds information to the text (rather than reducing the text) through a process of interpretation that simultaneously breaks the text down into meaningful chunks or segments. Thus, the coding process must include explicit guidelines for defining the boundaries of the text associated with a particular code. How the text is segmented is influenced by the data collection strategy and the way that strategy structures the resulting text. Data collection strategies generally fall along a continuum from minimally structured (e.g., transcribed interviews) to maximally structured (e.g., brief responses to open-ended survey questions). With large databases composed of in-depth transcribed interviews, we find it useful to begin by coding text according to the specific research questions used to frame the interview; we label this type of index coding Stage 1 or structural coding. The purpose of this step is to facilitate subsequent analysis by identifying all of the text associated with a particular elicitation or research question. The code inclusion criteria can include linguistic cues that signal the coder to apply structural codes to text that is out of sequence with the original interview structure. This is important for situations where respondents spontaneously return to earlier topics or make a cognitive leap to a topic that the interviewer intended to cover later.
With regard to segmenting the text for structural coding, we find it useful to include within each segment the full elicitation from the interviewer and the full response from the participant, including all dialog between the interviewer and participant that flows from the elicitation. This preserves both the flow of the interview and the full context of the discussion on a particular topic. Structural coding generally results in the identification of large segments of text on broad topics; these segments can then form the basis for an in-depth analysis within or across topics. We have found that code definitions that incorporate substantial references to professional jargon (i.e., etic codes) tend to encourage the use of coders' preconceptions in the analysis, making it difficult to distinguish the text (or "voice") of the respondent from that of the analyst or researcher. For in-depth analysis, we therefore attempt to use respondents' own terms and semantics to guide the construction of codes and their definitions (i.e., emic codes). In addition, the use of linguistic cues to assist coders in correctly applying codes to text tends to reduce the potential for misinterpretation and omission of relevant information. For example, in one of our projects we created three socioeconomic class status codes (poor, middle class, and rich) to measure the salience of socioeconomic class for the way individuals perceived the structure of their community. This was based on an etic or "objective" viewpoint. However, these codes did not capture the more subjective viewpoint of
the participants, who often described the financial status of others relative to their own, e.g., richer, poorer, "those with more money." The etic codes and definitions also led the coders to make implicit assumptions about education, employment, access to resources, and basic living conditions that were not always supported in the text. We therefore eliminated the etic codes with their implicit assumptions about social conditions and used a single code to capture all references to income levels and socioeconomic class, with additional codes to capture explicit references to homelessness and employment status. Text segmentation during in-depth analysis is less straightforward than during structural coding. A coded segment could be as simple as a text marker or tag placed over a word or phrase, with the boundaries of the segment free-floating; in this case the segment can be continuously redefined during analysis to include only the word, a set number of lines above and below the word, or the full text document. Alternatively, rules for bounding segments can be established a priori, e.g., through use of textual units such as sentences or grammatical guidelines such as requiring the inclusion of subject references. With brief responses to open-ended questions on standardized surveys there is generally little need to develop structural codes, because the data are prestructured by question and participant. Here the goal is to code the text in such a way that the information can be combined meaningfully with a quantitative database. The codes are used primarily to signal the presence or absence of particular pieces of information. The open-ended responses can then be summarized in a 0/1 matrix, where the rows are labeled with participant identification numbers and the columns with the codes; the cells are filled with either a "0" (to indicate that the information was not present) or a "1" (to indicate that it was present).
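The matrix construction can be sketched as follows. The MULTPARTS code echoes the example from Table 2; the participant IDs, the other codes, and the coded data are hypothetical:

```python
# Build a 0/1 presence/absence matrix from coded open-ended responses:
# rows are participants, columns are codes.
coded = {
    "P001": {"MULTPARTS", "NOCONDOM"},
    "P002": {"PARTNERPOS"},
    "P003": set(),                     # no codes applied to this response
}
codes = ["MULTPARTS", "NOCONDOM", "PARTNERPOS"]

matrix = {pid: [1 if c in applied else 0 for c in codes]
          for pid, applied in coded.items()}
print(matrix["P001"])   # [1, 1, 0]

# Collapsing code categories through a data transformation on the matrix,
# e.g. a single combined indicator for the first two (hypothetical) codes:
combined = {pid: int(any(row[:2])) for pid, row in matrix.items()}
print(combined)         # {'P001': 1, 'P002': 0, 'P003': 0}
```

A matrix in this form can be merged with a quantitative database on the participant ID, which is what makes the open-ended responses analyzable alongside the closed-ended survey items.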
For example, Table 2 presents a matrix for a hypothetical group of responses to a question concerning how persons think they may have been exposed to HIV. It would also be possible to attach values to the codes; for example, the MULTPARTS code could be valued to reflect the total number of partners that a person reported (provided that information is available from the data). In order to facilitate quantitative analysis of open-ended responses, it is generally advisable to limit the total number of codes to be used. However, this can easily be done by collapsing code categories together through data transformations performed on the matrix. The final code categories can be built into the coding process through the use of hierarchical codes and code families. Such a strategy is preferable to limiting the content of the codebook a priori, only to find that the resulting codes are too crude to permit a meaningful analysis. There are several software programs available that support the analysis of open-ended structured survey data. CDC EZ-Text (Carey et al. 1998) is specifically designed for text entry and coding of brief open-ended responses; as noted previously, it includes the same codebook structure outlined in this paper. It also includes matrix-building capabilities and an export feature that transforms the coded data into a quantitative database that can be imported into a number of statistical programs for further analysis. TextSmart™ (SPSS Inc.) is a new program that uses an automated content-analysis approach for quantitative analysis of very brief responses (<255 characters). We have not explored the utility of the structured codebook for this approach to coding. Between the examples discussed above (transcribed interviews and open-ended survey questions) lies a range of other possibilities. For example, rather than record and transcribe interviews verbatim, researchers may take notes and then organize their notes according to specific research topics.
Although the elicitation is relatively unstructured, the resulting text is highly structured and presegmented by research topic. Another possibility would be the collection of open-ended responses to a single question across multiple waves of structured interviews with the same individuals. The responses from each wave could be compiled into a single textual response for each individual, permitting segmentation by recurrent themes as well as by interview wave. The analysis of other types of text, such as field notes, diaries, and newsletters, presents its own challenges and possibilities. The issues of data collection and text segmentation as elements in the coding process are complicated and deserve further systematic exploration.

Refining the Codebook

Before coding an entire data set, we systematically evaluate the utility of the codes and the coders' ability to apply the codes in a consistent manner. The steps in this process begin with the development of an initial code list derived from
etic concepts, emic themes, or both (Figure 1). Usually, one or two team members take a lead role in developing the list, which is then reviewed by the full research team (Figure 1). Once there is agreement on the scope and level of detail for the items in the list, the team leaders begin the process of codebook development. Definitions are proposed and reviewed by the team, with an emphasis on achieving clarity and explicit guidance for code application. When the team is comfortable with the code definitions, two or more coders are given the task of independently coding the same sample of text. The results of their coding are then compared for consistency of text segmentation and code application. If the results are consistent, coding continues with periodic checks for continued intercoder agreement. If they are inconsistent, the inconsistencies are reviewed by the coders and team leader(s). The codebook is reviewed to determine whether the inconsistencies are due to coder error, e.g., misunderstanding of terminology or guidelines. We view these as training errors, and they are generally handled by the coders and team leader(s). Other inconsistencies are due to problems with the code definitions, e.g., overlapping or ambiguous inclusion criteria that make it difficult to distinguish between two codes. These types of problems are generally discussed by the whole team, as they have implications for the interpretation of the text. Once the problems are identified and the codebook clarified, all previously coded text is reviewed and, if necessary, recoded so that it is consistent with the revised definitions. Intercoder agreement is again checked to ensure that the new guidelines have resolved the problem. This iterative coding process continues until all text has been satisfactorily coded. There are a variety of ways that intercoder agreement can be assessed.
For transcribed interviews, we generally prefer a detailed segment-by-segment review that includes assessments of consistency in defining the beginning and end of segments as well as in the application of codes within segments. All inconsistencies are noted, discussed, and resolved. When only a subsample of the text is coded by multiple coders, this strategy may not capture all coding errors. In particular, it is prone to the loss of pertinent text, that is, text that is not captured by any code. Erroneously captured text, in contrast, is more likely to be spotted during the course of either coding checks or subsequent analysis. As noted previously, we also use kappa to statistically assess the level of intercoder agreement. This approach is useful for identifying coder error, as long as the error is not due to systematic mistraining of all coders. It is also helpful for identifying poorly defined codes, which tend to be applied erratically.

Some Practical Suggestions

We have learned eight practical lessons from our experience with team-based codebook development and application.

1. Assign primary responsibility to one person for creating, updating, and revising a given codebook. This will ensure that the codebook is consistent, and it will minimize ambiguities due to differences in vocabulary and writing styles. Require all members of the coding team to clear all coding questions and clarifications with the codebook editor. Make certain that the codebook editor has the basic competence for the task.

2. Schedule regular meetings where the coding team reviews each code and definition in the codebook. It is easy for a coder to develop a set of implicit rules without realizing that the codebook no longer reflects his or her actual coding process; in addition, this evolving process may or may not be shared by other members of the coding team. Without regular review sessions, the codebook can quickly become obsolete and useless as an analytic tool.

3.
Establish a coding process that consciously seeks to enhance intercoder agreement. One way to do this is to create a codebook that someone can learn to use within a few hours. For the most part, coders can reasonably handle only a limited number of codes at one time; if the codebook contains more than 40 codes, the coding process needs to be done in stages. Separate the codebook into three or four related components or domains and have the coders apply one set of codes to the entire data set before moving on to the next set.

4. Develop a written plan for segmenting text at the same time the codebook is being developed. Segmenting complex
text into smaller natural units, such as a line or a sentence, may be helpful. If you are segmenting less than a sentence, it may be better to use a word-based content-analysis strategy than a codebook strategy. Of course, the word-based strategy can evolve into codebook development.

5. Establish intercoder agreement measures early in the process of codebook development. Decide a priori how much variability you are willing to permit with regard to both segmentation and code application, and how you will assess that variability. Make sure the coding team understands and agrees with the criteria, and that coding will not be considered final until those criteria are met.

6. When defining codes, do not assume that anything is obvious; always state specifically what the code should and should not capture. This includes defining common abbreviations and "shorthand" references that may occur in the text. Include all such information in the full definition of the code and the explanations for its use. Things that seem obvious to coders at one place and time can be totally obscure in another context, even to the same coders.

7. Don't clutter the codebook with deadwood: throw out codes that don't work, and rework definitions for codes that are problematic. Some codes may capture the specificity of a single response rather than general patterns and themes. It is best to either eliminate these types of codes from the codebook or expand their usage; a single generic code (e.g., UNIQUE) can be designed to capture all unique responses on a given topic. The codebook should be a distillation, not a historical document. If maintaining a history of the codebook development process is relevant, then develop a separate strategy for that purpose.

8. Finally, accept the fact that text will need to be recoded as the codebook is refined. Recoding should not be viewed as a step back; it is always indicative of forward movement in the analysis.
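For the kappa statistic used in the intercoder agreement checks discussed earlier, here is a minimal two-coder sketch for a single code, where 1 means the code was applied to a segment and 0 means it was not (the data are hypothetical):

```python
# Cohen's kappa: observed agreement corrected for chance agreement,
# computed from two coders' application of one code across segments.
coder_a = [1, 0, 1, 1, 0, 0, 1, 0]
coder_b = [1, 0, 1, 0, 0, 0, 1, 1]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n   # 6/8 = 0.75

# Chance agreement from each coder's marginal rate of applying the code.
pa, pb = sum(coder_a) / n, sum(coder_b) / n
chance = pa * pb + (1 - pa) * (1 - pb)

kappa = (observed - chance) / (1 - chance)
print(round(kappa, 2))   # 0.5
```

A kappa near zero signals agreement no better than chance, the pattern typical of the poorly defined, erratically applied codes described above; systematic mistraining of all coders, however, can still yield high kappa on a bad codebook.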
Conclusion

The interdisciplinary team-based approach to qualitative analysis at CDC has provided us with both the incentive and the resources to develop explicit guidelines for and documentation of our methods. For coding, this has led to the generation of a basic structure for organizing codebooks that is also flexible enough to meet the needs of a variety of coding situations. When combined with an informed discussion of the coding process, the use of a flexible yet standard structure facilitates coder training. It also enhances the analyst's ability to transfer skills from one project to another, regardless of variability in the particular software tools used.

Acknowledgments

We thank Gery Ryan, Jim Carey, and Linda Koenig for their helpful suggestions and comments on earlier drafts of this paper. Many thanks to Bob Trotter and the Project LinCS collaborators who patiently field tested and critiqued the methods described here.

Notes

1. Use of trade names is for identification only and does not imply endorsement by the Public Health Service or by the U.S. Department of Health and Human Services.

References

Bernard, H. R. 1994. Research Methods in Anthropology: Qualitative and Quantitative Approaches, 2nd ed. Walnut Creek, CA: AltaMira Press.

Bernard, H. R. and G. Ryan. In press. Qualitative and Quantitative Methods of Text Analysis. In Handbook of Research Methods in Cultural Anthropology, H. R. Bernard, ed. Walnut Creek, CA: AltaMira Press.
Blanchard, L. 1997. How Do You Define Community? Perspectives of Community Members. Paper presented at the American Anthropological Association Annual Meeting, November 19-23, Washington, DC.

Bowyer, J. W. 1991. Tally: A Text Analysis Tool for the Liberal Arts. Dubuque: William C. Brown Publishers.

Carey, J. W., M. Morgan, and M. J. Oxtoby. 1996. Intercoder Agreement in Analysis of Responses to Open-Ended Interview Questions: Examples from Tuberculosis Research. CAM 8(3):1-5.

Carey, J. W., P. H. Wenzel, C. Reilly, J. Sheridan, and J. M. Steinberg. 1998. CDC EZ-Text: Software for Management and Analysis of Semistructured Qualitative Data Sets. CAM 10(1):

Cotton, D. A., R. J. Cabral, S. Semaan, and A. Gielen. 1998. Prevention of HIV in Women and Infants Projects: Application of the Transtheoretical Model for HIV Prevention in a Facility-Based and a Community-Level Behavioral Intervention Research Study. Manuscript submitted for publication.

DiClemente, C. C. and J. O. Prochaska. 1985. Processes and Stages of Self-Change: Coping and Competence in Smoking Behavior Change. In Coping and Substance Abuse, S. Schiffman and T. A. Wills, eds. New York: Academic Press.

MacQueen, K. M., K. L. Kay, B. N. Bartholow, et al. 1996. The Relationship Between Perceived Exposure to HIV and Actual Risk in a Cohort of Gay and Bisexual Men. Poster presentation at the XI International Conference on AIDS, July 7-12, Vancouver, BC, Canada.

MacQueen, K. M. and R. T. Trotter. 1997. Project LinCS: A Multisite Ethnographic Design for Public Health Research. Paper presented at the American Anthropological Association Annual Meeting, November 19-23, Washington, DC.

Miles, M. B. and A. M. Huberman. 1994. Qualitative Data Analysis, 2nd ed. Thousand Oaks, CA: Sage Publications.

Milstein, B., T. Lockaby, L. Fogarty, A. Cohen, and D. Cotton. In press. Women's Processes of Behavior Change for Condom Use. Journal of Health Psychology.

Muhr, T. 1994. ATLAS/ti: Computer Aided Text Interpretation & Theory Building, Release 1.1E. User's Manual, 2nd ed. Berlin: Thomas Muhr.
O'Connell, D. 1989. Development of a Change Process Coding System. Ph.D. diss., University of Rhode Island, Kingston, RI.

Prochaska, J. O. 1984. Systems of Psychotherapy: A Transtheoretical Analysis, 2nd ed. Homewood, IL: Dorsey Press.

Seidel, J. and U. Kelle. 1995. Different Functions of Coding in the Analysis of Textual Data. In Computer-Aided Qualitative Data Analysis: Theory, Methods, and Practice, Udo Kelle, ed. Thousand Oaks, CA: Sage Publications.

Strauss, A. and J. Corbin. 1990. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Newbury Park, CA: Sage Publications.

Trotter, R. T. 1993. Review of TALLY 3.0. CAM 5(2):10-12.
ATLAS.ti: The Qualitative Data Analysis Workbench
ATLAS.ti: The Qualitative Data Analysis Workbench An overview November 22, 2012 Ricardo B. Contreras, PhD Applied cultural anthropologist Director of the ATLAS.ti Training Center Greenville, North Carolina,
Using qualitative research to explore women s responses
Using qualitative research to explore women s responses Towards meaningful assistance - how evidence from qualitative studies can help to meet survivors needs Possible questions Why do survivors of SV
Choosing a CAQDAS Package Using Software for Qualitative Data Analysis : A step by step Guide
A working paper by Ann Lewins & Christina Silver, 6th edition April 2009 CAQDAS Networking Project and Qualitative Innovations in CAQDAS Project. (QUIC) See also the individual software reviews available
Text Mining - Scope and Applications
Journal of Computer Science and Applications. ISSN 2231-1270 Volume 5, Number 2 (2013), pp. 51-55 International Research Publication House http://www.irphouse.com Text Mining - Scope and Applications Miss
Qualitative Research Methods in Psychology - Lab Department of Psychology
Qualitative Research Methods in Psychology - Lab Department of Psychology Lab Objectives: (1) Develop skills in interview transcription for the purposes of qualitative analysis. (2) Develop skills in inductive
How To Write The English Language Learner Can Do Booklet
WORLD-CLASS INSTRUCTIONAL DESIGN AND ASSESSMENT The English Language Learner CAN DO Booklet Grades 9-12 Includes: Performance Definitions CAN DO Descriptors For use in conjunction with the WIDA English
Using the SAS System and QSR NUD*IST for qualitative data analysis (QDA)
Using the SAS System and QSR NUD*IST for qualitative data analysis (QDA) Robert G. Stewart, East Tennessee State University, Johnson City, TN Abstract With the advent of mixed-method inquiry (i.e., the
CAQDAS past, present and future: A comparison of reporting practices in the use of ATLAS.ti and NVivo
CAQDAS past, present and future: A comparison of reporting practices in the use of ATLAS.ti and NVivo Megan Woods Tasmanian School of Business & Economics, University of Tasmania Trena Paulus Educational
The Alignment of Common Core and ACT s College and Career Readiness System. June 2010
The Alignment of Common Core and ACT s College and Career Readiness System June 2010 ACT is an independent, not-for-profit organization that provides assessment, research, information, and program management
USAGE OF NVIVO SOFTWARE FOR QUALITATIVE DATA ANALYSIS
USAGE OF NVIVO SOFTWARE FOR QUALITATIVE DATA ANALYSIS Muhammad Azeem Assessment Expert, PEAS University of Education, Lahore PAKISTAN [email protected] Naseer Ahmad Salfi Doctoral Research Fellow
Research Methods: Qualitative Approach
Research Methods: Qualitative Approach Sharon E. McKenzie, PhD, MS, CTRS, CDP Assistant Professor/Research Scientist Coordinator Gerontology Certificate Program Kean University Dept. of Physical Education,
A general inductive approach for qualitative data analysis
A general inductive approach for qualitative data analysis David R. Thomas School of Population Health University of Auckland, New Zealand Phone +64-9-3737599 Ext 85657 email [email protected] August
Analyzing Research Articles: A Guide for Readers and Writers 1. Sam Mathews, Ph.D. Department of Psychology The University of West Florida
Analyzing Research Articles: A Guide for Readers and Writers 1 Sam Mathews, Ph.D. Department of Psychology The University of West Florida The critical reader of a research report expects the writer to
Determines if the data you collect is practical for analysis. Reviews the appropriateness of your data collection methods.
Performing a Community Assessment 37 STEP 5: DETERMINE HOW TO UNDERSTAND THE INFORMATION (ANALYZE DATA) Now that you have collected data, what does it mean? Making sense of this information is arguably
Qualitative Interview Design: A Practical Guide for Novice Investigators
The Qualitative Report Volume 15 Number 3 May 2010 754-760 http://www.nova.edu/ssss/qr/qr15-3/qid.pdf Qualitative Interview Design: A Practical Guide for Novice Investigators Daniel W. Turner, III Nova
Principles of Qualitative Research: Designing a Qualitative Study
Principles of Qualitative Research: Designing a Qualitative Study John W. Creswell, Ph.D. Vicki L. Plano Clark, M.S. Objectives As a group activity, to plan a qualitative study on the topic of leadership
SSS 955 ADVANCED QUALITATIVE RESEARCH METHODOLOGIES
THE CATHOLIC UNIVERSITY OF AMERICA CUA National Catholic School of Social Service Washington, DC 20064 202-319-5458 Fax 202-319-5093 SSS 955 ADVANCED QUALITATIVE RESEARCH METHODOLOGIES I. COURSE PURPOSE
UNIT 2: CRITICAL THINKING IN GAME DESIGN
UNIT 2: CRITICAL THINKING IN GAME DESIGN UNIT 2: CRITICAL THINKING IN GAME DESIGN 2.A: Unit Overview 2.B: Instructional Resource Guide 2.C: Learning Activities Guide 2.D: Standards Alignment Guide 2.E:
Analyzing Qualitative Data
G3658-12 University of Wisconsin-Extension Cooperative Extension Madison, Wisconsin PD & E Program Development & Evaluation Analyzing Qualitative Data 2003 Ellen Taylor-Powell Marcus Renner Introduction
Reading Instruction and Reading Achievement Among ELL Students
Research Into Practice READING Reading Instruction and Reading Achievement Among ELL Students Principles of ELL Reading Instruction Some very straightforward principles, directly supported by research,
Analyzing survey text: a brief overview
IBM SPSS Text Analytics for Surveys Analyzing survey text: a brief overview Learn how gives you greater insight Contents 1 Introduction 2 The role of text in survey research 2 Approaches to text mining
A Taxonomy for Education and Training in Professional Psychology Health Service Specialties
Commission for the Recognition of Specialties and Proficiencies in Professional Psychology Education AND Training Guidelines A Taxonomy for Education and Training in Professional Psychology Health Service
Intended Use of the document: Teachers who are using standards based reporting in their classrooms.
Standards Based Grading reports a student s ability to demonstrate mastery of a given standard. This Excel spread sheet is designed to support this method of assessing and reporting student learning. Purpose
Role of the Researcher
Analyzing Qualitative Data: With or without software Sharlene Hesse-Biber, Ph.D. Department of Sociology Boston College Chestnut Hill, MA 02467 [email protected] 4/19/10 1 Role of the Researcher YOU are a data
Developing an R Series Plan that Incorporates Mixed Methods Research
Developing an R Series Plan that Incorporates Mixed Methods Research / 16 Developing an R Series Plan that Incorporates Mixed Methods Research primary mechanism for obtaining NIH grants to conduct mixed
A General Inductive Approach for Analyzing Qualitative Evaluation Data. David R. Thomas University of Auckland
10.1177/1098214005283748 American Thomas / Analyzing Journal of Qualitative Evaluation / Evaluation June 2006Data Method Notes This section includes shorter papers (e.g., 10-15 double-spaced manuscript
Undergraduate Psychology Major Learning Goals and Outcomes i
Undergraduate Psychology Major Learning Goals and Outcomes i Goal 1: Knowledge Base of Psychology Demonstrate familiarity with the major concepts, theoretical perspectives, empirical findings, and historical
QUALITATIVE RESEARCH. [Adapted from a presentation by Jan Anderson, University of Teesside, UK]
QUALITATIVE RESEARCH [Adapted from a presentation by Jan Anderson, University of Teesside, UK] QUALITATIVE RESEARCH There have been many debates around what actually constitutes qualitative research whether
Methods in Case Study Analysis
Methods in Case Study Analysis Linda T. Kohn, Ph.D. The Center for Studying Health System Change Technical Publication No. 2 June 1997 Methods in Case Study Analysis Linda T. Kohn, Ph.D. The Center for
USING NVIVO FOR DATA ANALYSIS IN QUALITATIVE RESEARCH AlYahmady Hamed Hilal Saleh Said Alabri Ministry of Education, Sultanate of Oman
USING NVIVO FOR DATA ANALYSIS IN QUALITATIVE RESEARCH AlYahmady Hamed Hilal Saleh Said Alabri Ministry of Education, Sultanate of Oman ABSTRACT _ Qualitative data is characterized by its subjectivity,
Review Protocol Agile Software Development
Review Protocol Agile Software Development Tore Dybå 1. Background The concept of Agile Software Development has sparked a lot of interest in both industry and academia. Advocates of agile methods consider
ATLAS.ti 7 Distinguishing features and functions
ATLAS.ti 7 Distinguishing features and functions This document is intended to be read in conjunction with the Choosing a CAQDAS Package Working Paper which provides a more general commentary of common
Qualitative Research Methods Overview
Qualitative Research Methods: A Data Collector s Field Guide Module 1 Qualitative Research Methods Overview F A M I L Y H E A L T H I N T E R N A T I O N A L Qualitative Research Methods Overview This
4.5 Data analysis. 4.5.1 Transcription
OnlinePLUS supplementary material Adolphsen, Manuel (2014). Communication Strategies of Governments and NGOs: Engineering Global Discourse at High-Level International Summits. Wiesbaden: Springer VS. 4.5
Reviewed by Ok s a n a Afitska, University of Bristol
Vol. 3, No. 2 (December2009), pp. 226-235 http://nflrc.hawaii.edu/ldc/ http://hdl.handle.net/10125/4441 Transana 2.30 from Wisconsin Center for Education Research Reviewed by Ok s a n a Afitska, University
This article is the fourth in a series that aims to assist nurses working. Research and diabetes nursing. Part 4: Qualitative designs.
Research and diabetes nursing. Part 4: Qualitative designs Article points 1. Qualitative research is a broad term that covers a number of diverse approaches which seek to understand by means of exploration,
4. Is the study design appropriate for the research question and objectives?
Guidelines for Critical Review of Qualitative Studies Based on Guidelines for Critical Review Form-Qualitative Studies by Law, M., Stewart, D., Letts, L., Pollock, N., Bosch, J., & Westmorland, M., 1998
How to Conduct Ethnographic Research
The Qualitative Report Volume 16 Number 2 March 2011 567-573 http://www.nova.edu/ssss/qr/qr16-2/sangasubanat.pdf How to Conduct Ethnographic Research Nisaratana Sangasubana Nova Southeastern University,
Using an Instructional Systems Development Model as a Framework for Research on Scale Up 1
Using an Instructional Systems Development Model as a Framework for Research on Scale Up 1 Michael R. Vitale East Carolina University Nancy R. Romance Florida Atlantic University Abstract This paper presents
QDA Miner 3.2 (with WordStat & Simstat) Distinguishing features and functions Christina Silver & Ann Lewins
QDA Miner 3.2 (with WordStat & Simstat) Distinguishing features and functions Christina Silver & Ann Lewins This document is meant to be read in conjunction with the Choosing a CAQDAS Package Working Paper
Blog, blog, blog Online Journaling in Graduate Classes
Blog, blog, blog Online Journaling in Graduate Classes 1 Lyubov Laroche, 2 Barbara Newman Young, 1 Dorothy Valcarcel Craig 1 Western Washington University, 2 Middle Tennessee State University (USA) Abstract
This Module goes from 9/30 (10/2) 10/13 (10/15) MODULE 3 QUALITATIVE RESEARCH TOPIC 1. **Essential Questions**
This Module goes from 9/30 (10/2) 10/13 (10/15) NOTE 1: THERE IS A LOT OF READING OVER THE NEXT TWO WEEKS, BUT I HAVE BROKEN THEM INTO CHUNKS TO MAKE IT MORE DIGESTIBLE. THE TASKS ARE MINIMAL; IT S REALLY
Concept-Mapping Software: How effective is the learning tool in an online learning environment?
Concept-Mapping Software: How effective is the learning tool in an online learning environment? Online learning environments address the educational objectives by putting the learner at the center of the
ANALYSIS OF FOCUS GROUP DATA
ANALYSIS OF FOCUS GROUP DATA Focus Groups generate a large amount of data which needs to be organized and processed so that the main ideas are elicited. The first step is transcribing the FGs in a way
Qualitative methods for effectiveness evaluation: When numbers are not enough
Chapter 7 Qualitative methods for effectiveness evaluation: When numbers are not enough 7.1 Introduction 7.2 Methods of collecting qualitative information 7.2.1 Interviews and focus groups 7.2.2 Questionnaires
Appendix 1: Methodology Interviews. A1 Sampling Strategy
Appendix 1: Methodology Interviews A1 Sampling Strategy Creswell noted that in qualitative research, the intent is not to generalize to a population, but to develop an in-depth exploration of a central
NEW YORK STATE TEACHER CERTIFICATION EXAMINATIONS
NEW YORK STATE TEACHER CERTIFICATION EXAMINATIONS TEST DESIGN AND FRAMEWORK DRAFT June 2012 This document is a working draft. The information in this document is subject to change, and any changes will
APPENDIX B: COMPUTER SOFTWARE TO COMPILE AND ANALYZE DATA
Appendix B: Computer Software to Compile and Analyze Data B-63 APPENDIX B: COMPUTER SOFTWARE TO COMPILE AND ANALYZE DATA B-64 Appendix B: Computer Software to Compile and Analyze Data APPENDIX B: COMPUTER
Using Common Software Programs for Qualitative Data Analysis (QDA) Jean N. Scandlyn University of Colorado Denver Health and Behavioral Sciences
Using Common Software Programs for Qualitative Data Analysis (QDA) Jean N. Scandlyn University of Colorado Denver Health and Behavioral Sciences Topics Data display Tagging and retrieving text Coding Sorting
How To Be A Successful Writer
S WORKING DRAFT FOR PILOT ACROSS GRADUATE PROGRAMS Approved by GASCC; Revised by the Assessment Council, Spring 2013 April 2 nd, 2013 Notre Dame de Namur University Note: Most rubrics adapted from AAC&U
Preparing and Refining a Research Proposal: Tips for Inexperienced Scientists
Preparing and Refining a Research Proposal: Tips for Inexperienced Scientists Topics Proposal format Strengthening elements Internal continuity Funding targets Know your donor Grantee ethics Never too
Depth-of-Knowledge Levels for Four Content Areas Norman L. Webb March 28, 2002. Reading (based on Wixson, 1999)
Depth-of-Knowledge Levels for Four Content Areas Norman L. Webb March 28, 2002 Language Arts Levels of Depth of Knowledge Interpreting and assigning depth-of-knowledge levels to both objectives within
Human resources development and training
International Labour Conference 92nd Session 2004 Report IV (1) Human resources development and training Fourth item on the agenda International Labour Office Geneva ISBN 92-2-113036-3 ISSN 0074-6681 First
14. Qualitative Field Research
196 Qualitative Field Research 14. Qualitative Field Research Qualitative research is the method of choice when the research question requires an understanding of processes, events and relationships in
Guidelines for Master of Public Health Master's Essay
Guidelines for Master of Public Health Master's Essay Department of Public Health The University of Tennessee 1914 Andy Holt Avenue (390 HPER) Knoxville, Tennessee 37996-2710 (865) 974-5041 http://publichealth.utk.edu
CREDIT TRANSFER: GUIDELINES FOR STUDENT TRANSFER AND ARTICULATION AMONG MISSOURI COLLEGES AND UNIVERSITIES
CREDIT TRANSFER: GUIDELINES FOR STUDENT TRANSFER AND ARTICULATION AMONG MISSOURI COLLEGES AND UNIVERSITIES With Revisions as Proposed by the General Education Steering Committee [Extracts] A. RATIONALE
PENNSYLVANIA COMMON CORE STANDARDS English Language Arts Grades 9-12
1.2 Reading Informational Text Students read, understand, and respond to informational text with emphasis on comprehension, making connections among ideas and between texts with focus on textual evidence.
Usability Evaluation with Users CMPT 281
Usability Evaluation with Users CMPT 281 Outline Usability review Observational methods Interview methods Questionnaire methods Usability ISO 9241-11: The extent to which a product can be used by specified
Appraising qualitative research articles in medicine and medical education
Medical Teacher, Vol. 27, No. 1, 2005, pp. 71 75 Appraising qualitative research articles in medicine and medical education LUC CÔTÉ & JEAN TURGEON Department of Family Medicine, Faculty of Medicine, Laval
Information Technology and Knowledge Management
Information Technology and Knowledge Management E. Shimemura and Y. Nakamori Japan Advanced Institute of Science and Technology 1-1 Asahidai, Tatsunokuchi, Ishikawa 923-1292, Japan Abstract This paper
Using Public Health Evaluation Models to Assess Health IT Implementations
Using Public Health Evaluation Models to Assess Health IT Implementations September 2011 White Paper Prepared for Healthcare Information and Management Systems Society 33 West Monroe Street Suite 1700
Framework Analysis: A Qualitative Methodology for Applied Policy Research. Aashish Srivastava 1. S. Bruce Thomson 2
Framework Analysis: A Qualitative Methodology for Applied Policy Research Aashish Srivastava 1 S. Bruce Thomson 2 Abstract Policies and procedures govern organizations whether they are private or public,
On the attributes of a critical literature review. Saunders, Mark N. K. 1 & Rojon, Céline 2. United Kingdom.
On the attributes of a critical literature review Saunders, Mark N. K. 1 & Rojon, Céline 2 1 School of Management, University of Surrey, Guildford, GU2 7XH, United Kingdom; 2 Department of Psychology &
Understanding News Researchers through a Content Analysis of Dissertations and Theses
Qualitative and Quantitative Methods in Libraries (QQML) 1:263 270, 2014 Understanding News Researchers through a Content Analysis of Dissertations and Theses Mary Feeney Associate Librarian, Research
ISRE 2400 (Revised), Engagements to Review Historical Financial Statements
International Auditing and Assurance Standards Board Exposure Draft January 2011 Comments requested by May 20, 2011 Proposed International Standard on Review Engagements ISRE 2400 (Revised), Engagements
Data Analysis and Statistical Software Workshop. Ted Kasha, B.S. Kimberly Galt, Pharm.D., Ph.D.(c) May 14, 2009
Data Analysis and Statistical Software Workshop Ted Kasha, B.S. Kimberly Galt, Pharm.D., Ph.D.(c) May 14, 2009 Learning Objectives: Data analysis commonly used today Available data analysis software packages
Interview studies. 1 Introduction... 1. 2 Applications of interview study designs... 2. 3 Outline of the design... 3
Interview studies Contents 1 Introduction... 1 2 Applications of interview study designs... 2 3 Outline of the design... 3 4 Strengths and weaknesses of interview study designs... 6 5 References... 7 1
SOME IMPORTANT NOTES ON QUALITATIVE RESEARCH. Prof. Dr. Andreas Budihardjo August, 2013
SOME IMPORTANT NOTES ON QUALITATIVE RESEARCH Prof. Dr. Andreas Budihardjo August, 2013 CHECKLIST ( LOFLAND & LOFLAND, 1995) Basic organization How well is the article written or presented? Data & Methods
ESL I English as a Second Language I Curriculum
ESL I English as a Second Language I Curriculum ESL Curriculum alignment with NJ English Language Proficiency Standards (Incorporating NJCCCS and WIDA Standards) Revised November, 2011 The ESL program
College of Arts and Sciences: Social Science and Humanities Outcomes
College of Arts and Sciences: Social Science and Humanities Outcomes Communication Information Mgt/ Quantitative Skills Valuing/Ethics/ Integrity Critical Thinking Content Knowledge Application/ Internship
Improving Traceability of Requirements Through Qualitative Data Analysis
Improving Traceability of Requirements Through Qualitative Data Analysis Andreas Kaufmann, Dirk Riehle Open Source Research Group, Computer Science Department Friedrich-Alexander University Erlangen Nürnberg
Evaluating Web Site Structure A Set of Techniques
Introduction Evaluating Web Site Structure A Set of Techniques K. Frederickson-Mele, Michael D. Levi, and Frederick G. Conrad U.S. Department of Labor, Bureau of Labor Statistics Washington, DC As the
