3. Outline
• Definitions of Research/Evaluation
• Purposes of Evaluation / Research
• Science and Scientific Management
• Research/Evaluation as Process
• Types of Research / Evaluation
4. Definitions
• Evaluation = Process of judging the merit or
worth of something
• Research
– application of scientific methods to answer
questions
– controlled inquiry directed at increasing
knowledge/establishing truth
• Evaluation Research - combines the two
5. Science
• Body of Knowledge
– systematic
– abstract/general
– parsimonious
– empirical
• Method of Inquiry
– logical (induction, deduction)
– self-corrective
6. Scientific Management
• Application of scientific principles to
management and decision making
– systematic information gathering
– empirical, objective, self-corrective
7. Process - Steps
Research                     Evaluation
• define problem             • describe program
• objectives/hypotheses      • evaluation criteria
• literature review          • program scoping
• research methods           • evaluation methods
• gather data/analysis       • gather data/analysis
• conclusions                • conclusions
8. Types of Evaluation
by Program Stage
• formative (conceptualization/design)
• process (implementation)
• summative (outcomes, impacts, efficiency)
9. Types - By Approach
• Standards
– norm-based
– criterion-referenced
• Goals and objectives
• Impacts or effects
10. Evaluation Criteria
• Effort - quantity and quality of inputs
• Performance - quantity and quality of outputs
• Adequacy - meet needs?
• Efficiency - benefits/costs
• Equity - distributional issues, fairness
11. Process Evaluation
• Identifies how and why program works
– attributes
– recipients
– conditions
– effects
• single or multiple
• intended or side effects
• timing & duration, long/short term
• cognitive, affective or behavioral
12. Research Process
Define Problem, Research Objectives
• What? - concepts, variables, measures
• How? (overall method) - survey, experiment, case study, secondary data
• Who? - population, sampling
Data Gathering → Analysis → Application
13. Proposal Format
1. Problem Statement - define program to be evaluated/problem to be studied,
users & uses of results. Justify importance of the problem/study.
2. Objectives - concise listing. In evaluation studies, the objectives
usually focus on the key elements of the program to be evaluated & the
evaluation criteria. These are the study objectives, NOT the program objectives.
3. Background/Literature Review - the place for a more extensive
history/structure of the program. Focus on the aspects most relevant to the
proposed evaluation. Discuss previous studies and relevant methods.
4. Methods - details on procedures for achieving objectives - data gathering and
analysis, population, sampling, measures, etc. Who will do what to whom, when,
where, how and why?
5. Attachments - budget, timeline, measurement instruments, etc.
NOTE: Most “programs” must be narrowed to specific components to be evaluated.
Think of a “Program of studies” rather than a single evaluation study. The proposal
should define this specific study & how it fits into a broader program of studies.
14. Purposes of Proposal
• Communicate with Client
• Demonstrate your grasp of problem
• Plan the study in advance, so others can
evaluate the study approach
– will it work?
– have you overlooked something?
– will results be useful to client?
– can we afford it?
15. Sample Objectives
1. Estimate benefits and costs of program
2. Estimate economic impacts of program on local
community (social, environmental, fiscal).
3. Determine effects of program on target
population.
4. Describe users and non-users of program
5. Assess community recreation needs, preferences
6. Determine market/financial feasibility of
program
7. Evaluate adequacy or performance of program
16. Methods Choices
• Overall Approach/Design
– Qualitative or Quantitative
– Primary or secondary data
– Survey, experiment, case study, etc.
• Who to study - population, sample
– individuals, market segments, populations
• What to study - concepts, measures
– behavior, knowledge, attitudes
• Cost vs Benefit of Study
17. Definition & Measurement
“measurement is the beginning of science, … until you
can measure something, your knowledge is meager and
unsatisfactory” - Lord Kelvin
Nominal/Conceptual Definition - defines a concept
in terms of other concepts; links concepts
without tying them to the real world
Operational Definition - equates definition with
measurement; specifies the procedures/operations
used to measure the concept.
18. Levels of Measurement
Level      Characteristic                            Example
Nominal    Unordered categories                      Race, gender
Ordinal    Ordered categories                        Small/medium/large; hardness scale
Interval   Consistent distance between categories    Temperature in Fahrenheit or Celsius
Ratio      Natural zero                              Temperature in Kelvin
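To make these distinctions concrete, here is a minimal Python sketch (not from the slides; all values hypothetical) of which operations are meaningful at each level:

```python
# Nominal: categories only -- equality checks are valid, order is not.
assert "female" != "male"

# Ordinal: order is meaningful, distances between ranks are not.
rank = {"sm": 0, "med": 1, "lg": 2}
assert rank["med"] > rank["sm"]          # comparison: valid
# rank["lg"] - rank["sm"] looks numeric but is not a real "distance"

# Interval: differences are meaningful; ratios are not (no natural zero).
f_morning, f_noon = 40.0, 80.0           # degrees Fahrenheit
assert f_noon - f_morning == 40.0        # difference: valid
# f_noon / f_morning == 2 does NOT mean "twice as hot"

# Ratio: a natural zero makes ratios meaningful.
k_a, k_b = 200.0, 400.0                  # Kelvin
assert k_b / k_a == 2.0                  # ratio: valid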
20. Questionnaire Design
1. Preliminary Info
Information needed
Who are subjects
Method of communication
2. Question Content
3. Question Wording
4. Response Format
5. Question Sequencing/Layout
22. Sampling
• Always define study population first
• Use element/unit/extent/time for complete
definition
• element - who is interviewed
• sampling unit - basic unit containing elements
• extent - limit population (often spatially)
• time - fix population in time
23. Types of Sampling Approaches
• Probability vs non-Probability
• Judgment, Simple Random, Systematic
• Stratify or Cluster (Area Sample)
• Time Sampling
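As a rough illustration of three of these approaches, here is a minimal Python sketch using only the standard library (the visitor frame, strata, and sample sizes are hypothetical):

```python
import random

random.seed(42)

# Hypothetical frame: 900 park visitors tagged with a stratum.
population = [{"id": i, "stratum": "local" if i % 3 else "tourist"}
              for i in range(1, 901)]
n = 30

# Simple random sample: every element has an equal chance of selection.
srs = random.sample(population, n)

# Systematic sample: random start, then every k-th element.
k = len(population) // n
start = random.randrange(k)
systematic = population[start::k][:n]

# Stratified sample: proportional simple random samples within each stratum.
strata = {}
for el in population:
    strata.setdefault(el["stratum"], []).append(el)
stratified = []
for members in strata.values():
    share = round(n * len(members) / len(population))
    stratified.extend(random.sample(members, share))

print(len(srs), len(systematic), len(stratified))  # 30 30 30
```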
24. Sample size
• Based on four factors
• Cost/budget
• Accuracy desired
• variance in population on variable of interest
• subgroup analysis planned
• Formula: n = Z²σ² / e² (a short calculation sketch follows)
• n = sample size
• Z indicates confidence level (95% → 1.96)
• σ = standard deviation of variable in population
• e = acceptable sampling error
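A minimal Python sketch of this formula, assuming we are estimating a proportion so the worst-case σ is √(0.5 × 0.5) = 0.5; the ±5-point error target is hypothetical:

```python
import math

def sample_size(z: float, sigma: float, e: float) -> int:
    """n = Z^2 * sigma^2 / e^2, rounded up to a whole subject."""
    return math.ceil(z ** 2 * sigma ** 2 / e ** 2)

# Worst-case proportion (sigma = 0.5), 95% confidence (Z = 1.96),
# +/- 5 percentage points of sampling error.
print(sample_size(z=1.96, sigma=0.5, e=0.05))  # -> 385
```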
26. Computing a 95% Confidence Interval
• n = 100, sample mean = 46%, use p = 50/50
• sampling error from table = 10%
• 95% CI is 46% ± 10% = (36%, 56%)
• n = 1,000, sample mean = 22%
• sampling error from table = 2.5%
• 95% CI is 22% ± 2.5% = (19.5%, 24.5%)
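These table values can be reproduced (to rounding) from e = Z·√(p(1−p)/n). A minimal Python check using the observed proportions; note that the conservative p = 0.5 gives the same ≈10% for n = 100:

```python
import math

def sampling_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the 95% CI for a proportion: z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p * (1 - p) / n)

for n, p in [(100, 0.46), (1000, 0.22)]:
    e = sampling_error(p, n)
    print(f"n={n}: {p:.0%} ± {e:.1%} -> ({p - e:.1%}, {p + e:.1%})")
# n=100:  46% ± 9.8% -> (36.2%, 55.8%)   (~ the table's 10%)
# n=1000: 22% ± 2.6% -> (19.4%, 24.6%)   (~ the table's 2.5%)
```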
27. Research Designs/Data Collection Approaches
How gathered \ Where   Household   On-Site                      Laboratory
Personal Interview     Surveys     Surveys, Field Expmts        Focus Groups
Telephone/Computer     Surveys     Computer Interviews          Computer Interviews
Self-Admin. Quest.     Surveys     Surveys, Field Expmts        Experiments
Observation & Traces   NA          Observable Characteristics   Observable Characteristics
Secondary Sources      NA          Internal Records             NA
28. Major Design Types
• Surveys
• Experiments
• Observation
• Secondary Data
• Qualitative Approaches
– Focus Group
– Case Study
29. General Guidelines on when to
use different approaches
1. Describing a population - surveys
2. Describing users/visitors - on-site survey
3. Describing non-users, potential users or
general population - household survey
4. Describing observable characteristics of
visitors - on-site observation
5. Measuring impacts, cause-effect relationships -
experiments
30. Guidelines (cont)
6. Anytime suitable secondary data exists -
secondary data
7. Short, simple household studies - phone
8. Captive audience or very interested population
- self-administered survey
9. Testing new ideas - experimentation or focus
groups
10. In-depth study - in-depth personal interviews,
focus groups, case studies
31. Primary or Secondary Data
• Secondary data are data that were collected
for some purpose other than your study, e.g.
government records, internal documents,
previous surveys
• Choice between Primary/Secondary Data
– Costs (time, money, personnel)
– Relevance, accuracy, adequacy of data
32. Qualitative vs Quantitative
              Quantitative           Qualitative
Purpose       General laws           Unique/individual case
              Test hypotheses        Understanding
              Predict behavior       Meanings/intentions
Perspective   Outsider-objective     Insider-subjective
Procedures    Structured             Unstructured
              Formal measures        Open-ended measures
              Probability samples    Judgment samples
              Statistical analysis   Interpretation of data
33. Qualitative vs Quantitative Approaches
Qualitative
• Focus group
• In-depth interview
• Case study
• Participant observation
• Secondary data analysis
Quantitative
• Surveys
• Experiments
• Structured observation
• Secondary data analysis
34. Survey vs Experiment
Survey - measures things as they are; a snapshot
of a population at one point in time; generally
refers to questionnaires
(telephone, self-administered, personal interview)
Experiment - manipulates at least one variable
(treatment) to evaluate the response and study
cause-effect relationships
(field and lab experiments)
35. Steps in a Survey
1. Define problem and study objectives
2. Identify information needs & study population(s)
3. Determine basic design/approach
- cross sectional vs longitudinal
- on-site vs household vs other
- self-admin. vs personal interview vs phone
- structured or unstructured questions
4. Questionnaire design
5. Choose sample (frame, size, sampling design)
6. Estimate time, costs, manpower needs, etc.
36. Survey Implementation
7. Proposal & “Human subjects” review
8. Line up necessary resources
9. Pre-test instruments and field procedures
10. Data gathering and follow-up procedures
11. Coding, cleaning and data processing
12. Analysis: preliminary, then final.
13. Communication and presentation of results.
37. Characteristics of a true Experiment
1. Sample equivalent experimental and
control groups
2. Isolate and control the treatment
3. Measure the effect
38. Pre-test/Post-test with Control
R  M_B1  X  M_A1   Experimental group
R  M_B2     M_A2   Control group
R denotes random assignment to groups
X denotes the treatment; M_B/M_A are the pre-/post-test measures
Measure of effect = ∆ Expmt gp - ∆ Control gp
= (M_A1 - M_B1) - (M_A2 - M_B2)
= with vs. without treatment
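A minimal Python simulation of this design (effect size, drift, and noise values are hypothetical), showing how subtracting the control group's change cancels history effects and recovers the treatment effect:

```python
import random

random.seed(7)
TRUE_EFFECT = 5.0   # hypothetical treatment effect of X
HISTORY = 2.0       # hypothetical drift affecting everyone between pre and post

mean = lambda xs: sum(xs) / len(xs)

# R: random assignment to experimental and control groups
scores = [random.gauss(50, 10) for _ in range(2000)]
random.shuffle(scores)
exp_pre, ctl_pre = scores[:1000], scores[1000:]

# Post-test: both groups experience HISTORY; only the experimental group gets X
exp_post = [m + HISTORY + TRUE_EFFECT + random.gauss(0, 2) for m in exp_pre]
ctl_post = [m + HISTORY + random.gauss(0, 2) for m in ctl_pre]

delta_exp = mean(exp_post) - mean(exp_pre)   # M_A1 - M_B1
delta_ctl = mean(ctl_post) - mean(ctl_pre)   # M_A2 - M_B2
print(f"estimated effect: {delta_exp - delta_ctl:.2f}")  # ~= 5.0; history cancels
```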
39. Threats to Internal validity
• * Pre-measurement (Testing): effect of pre-measurement
on the dependent variable (post-test)
• * Selection: nonequivalent experimental & control groups
(statistical regression is a special case)
• * History: impact of any other events between the pre- and
post-measures on the dependent variable
• * Interaction: alteration of the “effect” due to interaction
between treatment & pre-test
• Maturation: aging of subjects or measurement procedures
• Instrumentation: changes in instruments between pre-
and post-measures
• Mortality: loss of some subjects between measures
45. Three Audiences/styles
• Researchers – research journal style
– Technical, methods, statistical tests
• Managers – business style
– Results and implications
• Public – newspaper style
– Interesting, no jargon, highlights
46. Research vs Business Reports
• Written/Research
– Problem
– Objectives
– Methods
– Results
– Discussion
• Oral/Business
– Objectives
– Key results & recommendations
– Justify from study
– Brief methods
– Discussion
47. Reminders
• Final Exam is Friday Dec 15, 7:45-9:45 am,
this room
• Final Papers due by Wednesday Dec 13
• See YaYen Sun to finish lab work by end of
week.