Research design: the difference between method and methodologies


Research design: the difference between method and methodologies

Method: a technique for gathering evidence; the various ways of proceeding in gathering information.

Methodology: the underlying theory and analysis of how research does or should proceed, often influenced by discipline (Sandra Harding).

Empirical research methods

There are two major groups:

Descriptive (qualitative) methods:
Ethnography
Case study
Survey / sampling
Focus group
Discourse / text analysis
Prediction / classification

Experimental (quantitative) methods:
True experiment
Quasi-experiment
Meta-analysis

Ethnographies
+ Observational field work done in the actual context being studied
+ Focus on how individuals interrelate in their own environment (and the influence of this environment)
- Difficult to interpret/analyze
- Time consuming/expensive
- Can influence subject behavior

Case studies
+ Focus is on an individual or small group
+ Able to conduct a comprehensive analysis from a comparison of cases
+ Allows for identification of variables or phenomena to be studied
- Time consuming
- Depth rather than breadth
- Not necessarily representative

Survey research
+ An efficient means of gathering large amounts of data
+ Can be anonymous and inexpensive
- Feedback often incomplete
- Wording of the instrument can bias feedback
- Details often left out

Focus group
+ Aids in understanding an audience, group, or users
+ Small-group interaction more than individual response
+ Helps identify and fill gaps in current knowledge (perceptions, attitudes, feelings, etc.)
- Does not give statistics
- Seen as a marketing tool, and thus sometimes suspect
- Analysis is subjective

Discourse / text analysis
+ Examines actual discourse produced for a particular purpose (job, school)
+ Helps in understanding of context, production, audience, and text
- Schedule for analysis not demanding
- Labor intensive

Prediction / classification studies
+ Goal is to predict behaviors: prediction forecasts an interval variable; classification forecasts a nominal variable
+ Important in industry and education to predict behaviors
- Need a substantial population
- Restricted range of variables can cause misinterpretation
- Variables cannot simply be added together; they must be weighted and looked at in the context of other variables
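The weighting point above can be sketched in code. This is a hypothetical illustration, not from the original slides: two predictors on different scales are standardized and combined with assumed weights (in a real prediction study the weights would come from fitting a model, e.g. a regression, to data).

```python
def zscore(values):
    """Standardize values so variables on different scales can be weighted."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / sd for v in values]

# Hypothetical predictors of an interval outcome (e.g., first-year GPA).
# The weights are assumed for illustration only.
hs_gpa = [3.1, 3.8, 2.5, 3.4]
test_score = [1100, 1350, 900, 1200]
w_gpa, w_test = 0.6, 0.4

z_gpa, z_test = zscore(hs_gpa), zscore(test_score)
# Weighted combination, not a raw sum of incommensurable variables:
predicted = [w_gpa * g + w_test * t for g, t in zip(z_gpa, z_test)]
```

Standardizing first is what makes the weighting meaningful: adding a raw GPA (around 3) to a raw test score (around 1100) would let the larger-scaled variable swamp the other.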

Advantages of qualitative research
Naturalistic; allows subjects to interact with their environment
Can use statistical analysis
Seeks to further develop theory (not to influence action); prescientific
Coding schemes often arise from interplay between the data and the researcher's knowledge of theory

Disadvantages of qualitative research
Impossible to overlay structure
Impossible to impose control
Subject pool often limited, not representative
Seen as more subjective, less rigorous
Beneficial only as an initial investigation to form a hypothesis

True experiment
+ Random sampling, or selection, of subjects (which may also be stratified)
+ Introduction of a treatment
+ Use of a control group for comparing subjects who don't receive the treatment with those who do
+ Adherence to the scientific method
- Must have both internal and external validity
- Treatment and control might seem artificial
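The random-assignment step can be sketched minimally. The subject names, group sizes, and seed here are all hypothetical; this shows simple randomization only (a stratified design would first split subjects into strata and randomize within each):

```python
import random

def assign_groups(subjects, seed=42):
    """Randomly split subjects into treatment and control groups."""
    rng = random.Random(seed)  # fixed seed makes the assignment reproducible
    shuffled = subjects[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]  # (treatment, control)

subjects = [f"S{i}" for i in range(1, 21)]  # 20 hypothetical subjects
treatment, control = assign_groups(subjects)
```

Because assignment is random, any pre-existing differences between subjects are expected to spread evenly across the two groups, which is what licenses the causal comparison.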

Quasi-experiment
+ Similar to a true experiment, except that subjects are not randomized; intact groups are often used (for example, students in a classroom)
+ To draw more fully on the power of the experimental method, a pretest may be employed
+ Employs treatment, control, and the scientific method
- Act of control and treatment makes the situation artificial
- Small subject pools

Meta-analysis
+ Takes the results of true and quasi-experiments and identifies interrelationships of conclusions
+ Systematic
+ Replicable
+ Summarizes overall results
- Comparison of apples and oranges?
- Quality of the studies used?
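The "summarizes overall results" step is typically an inverse-variance weighted average of the individual studies' effect sizes. A minimal sketch of the fixed-effect model, with hypothetical effect sizes and variances (not from any real studies):

```python
def fixed_effect_mean(effects, variances):
    """Inverse-variance weighted mean effect size (fixed-effect model):
    precise studies (small variance) get larger weights."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_var = 1.0 / sum(weights)  # variance of the pooled estimate
    return pooled, pooled_var

# Hypothetical standardized mean differences from three studies
effects = [0.30, 0.55, 0.40]
variances = [0.04, 0.09, 0.02]
mean, var = fixed_effect_mean(effects, variances)
```

Note the pooled estimate lands closest to the most precise study (variance 0.02), which illustrates why the quality and comparability of the input studies ("apples and oranges") matter so much.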

Advantages of experimental research
Tests the validity of generalizations
Seen as rigorous
Identifies cause-and-effect relationships
Seen as more objective, less subjective
Can be predictive

Disadvantages of true experiments
Generalizations need to be qualified according to the limitations of the research methods employed
Controlled settings don't mirror actual conditions; unnatural
Difficult to isolate a single variable
Doesn't allow for self-reflection

What makes research good?
Validity
Reliability
Replicability
Consistent application/analysis
Trustworthiness
Rigor

Validity in research

Validity refers to whether the research actually measures what it says it will measure; it is the strength of our conclusions, inferences, or propositions.

Internal validity: the difference in the dependent variable is actually a result of the independent variable.
External validity: the results of the study are generalizable to other groups and environments outside the experimental setting.
Conclusion validity: we can identify a relationship between the treatment and the observed outcome.
Construct validity: we can generalize our conceptualized treatment and outcomes to broader constructs of the same concepts.

Reliability in research

Reliability is the consistency of a measurement, or the degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects. In short, it is the repeatability of your measurement. A measure is considered reliable if a person's score on the same test given twice is similar. It is important to remember that reliability is not measured; it is estimated, typically via test/retest and internal consistency.
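A common way to estimate test-retest reliability is the Pearson correlation between the two administrations of the same test. A minimal sketch with hypothetical scores from five people tested twice:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores: the same five people, same test, two occasions
test1 = [82, 75, 90, 68, 88]
test2 = [80, 78, 91, 70, 85]
r = pearson_r(test1, test2)  # close to 1.0 here, suggesting high reliability
```

A correlation near 1.0 indicates that people keep roughly the same rank order across administrations, which is exactly the repeatability the slide describes; what counts as "acceptable" varies by field.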

Validity and reliability

The relationship between reliability and validity is fairly simple: a measurement can be reliable but not valid, yet a measurement must first be reliable before it can be valid. Thus reliability is a necessary, but not sufficient, condition of validity. In other words, a measurement may consistently assess a phenomenon (or outcome), but unless it tests what you want it to, it is not valid.

Rigor in research
Validity and reliability in conducting the research
Adequate presentation of findings: consistency, trustworthiness
Appropriate representation of the study for a particular field: disciplinary rigor
Rhetorical rigor: how you represent your research for a particular audience

Key considerations in designing your research approach
What question do you want to answer?
For what purposes is the research being done? That is, what do you want to be able to do or decide as a result of the research?
Who are the audiences for the information from the research (e.g., teachers, students, other researchers, members of a disciplinary community, corporate entities)?
From what sources should the information be collected (e.g., students, teachers, targeted groups, certain documentation)?

How can that information be collected in a reasonable fashion (e.g., questionnaires, interviews, examining documentation, observing staff and/or clients in the program, conducting focus groups among staff and/or students)?
How accurate will this information be?
When is the information needed (i.e., by when must it be collected)?
What resources are available to collect the information?
How will this information be analyzed?

The importance of methods and methodologies

"The most common error made in reading [and conducting] research is overlooking the methodology and concentrating on the conclusions. Yet if the methodology isn't sound, the conclusions and subsequent recommendations won't be sound." (Patricia Goubil-Gambrell)