A. Thinking about evaluation
Evaluation as a professional field emerged only in the second half of the 20th century. What is evaluation? What is its history? What is its role in society? What is its connection with research? What conceptual models does it use? What practical tools and strategies does it use? This is a list of key readings to give starting points for thinking about these questions.
Evaluation history and context
Evaluation as a professional field emerged only in the second half of the 20th century. To understand evaluation today it is useful to have an overview of its history, some of the ways it can be described, and the issues shaping it at present.
1. Ray Pawson & Nick Tilley (1997). "A history of evaluation in 28½ pages", in Realistic Evaluation. Sage. (29pp)
2. Mansoor A.F. Kazi (2003). "Contemporary perspectives in practice evaluation", Chapter 2 in Realist Evaluation in Practice: Health and Social Work. Sage. (12pp)
3. Peter Dahler-Larsen (2006). "Evaluation after Disenchantment? Five issues shaping the role of evaluation in society", Chapter 6 in The Sage Handbook of Evaluation, edited by Ian Shaw, Jennifer Greene and Melvin Mark. Sage. (19pp)
Evaluation
There are many views about what evaluation is. Here are two useful starting points.
4. E. Jane Davidson (2005). "What is Evaluation?", Chapter 1 in Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation. Sage. (12pp)
5. Randy Stoecker (2005). "Evaluation", Chapter 7 in Research Methods for Community Change. Sage. (29pp)
Research, evaluation and policy
Research and evaluation are connected with policy. There are many ways of thinking about this connection. The following chapter describes several models for this relationship.
6. Brendan Gibson (2003). "Beyond 'Two Communities'", Chapter 2 in Evidence-Based Health Policy: Problems and Possibilities, edited by Vivian Lin and Brendan Gibson. Oxford University Press. (13pp)
Values
Evaluation is about valuing. There are no value-free evaluations.
7. E. Jane Davidson (2005). "Values in Evaluation", Chapter 6 in Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation. Sage. (13pp)
How we know what isn't so
Some of the errors we make because of the nature of human thinking include: we jump to conclusions, seeing more than is there and creating order out of random data; we see what we expect to see, especially when the evidence is ambiguous; and we see what we want to see, because our motivations affect our inferences.
8. Thomas Gilovich (1991). "Seeing what we expect to see: The biased evaluation of ambiguous and inconsistent data", Chapter 4 in How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. The Free Press. (24pp)
9. Thomas Gilovich (1991). "Seeing what we want to see: Motivational determinants of belief", Chapter 5 in How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. The Free Press. (24pp)
Organisations
Human services are typically provided through organisations. When evaluating human services one needs ways of conceptualising organisations (how do we think about organisations?) and ways of identifying what makes a good organisation (what are the characteristics of a thriving organisation?).
10. Lee G. Bolman and Terrence E. Deal (2003). "Introduction: The Power of Reframing", Chapter 1 in Reframing Organizations: Artistry, Choice and Leadership, Third Edition. Jossey-Bass. (17pp)
11. Paul Bullen, Ways of seeing organisations
12. Paul Bullen, Characteristics of thriving organisations
Outcomes hierarchies and theories of action
A key challenge in evaluating human service programs is making explicit what the program is intended to do and how it works. This is commonly referred to as a theory of action and/or an outcomes hierarchy.
13. Michael Quinn Patton (2008). "Conceptualizing the Intervention", Chapter 10 in Utilization-Focused Evaluation, 4th Edition. Sage. (22pp)
14. Louisa Gosling with Mike Edwards (2006). Tool 3 "Logical framework analysis" in Toolkits: A Practical Guide to Planning, Monitoring, Evaluation and Impact Assessment. Save the Children. (13pp)
Causation
In human services it is often difficult to show cause and effect relationships in programs. Evaluation of programs needs to deal with the causation issue in some way. The first reading suggests ways of gathering evidence about causation; the second describes the connection between causation and research design. The third brings together these and other related ideas.
15. E. Jane Davidson (2005). "Dealing with the causation issue", Chapter 5 in Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation. Sage. (18pp)
16. David de Vaus (2001). "Causation and the Logic of Research Design", Chapter 3 in Research Design in Social Research. Sage. (19pp)
17. Paul Bullen, Finding cause and effect in human services (web site)
Qualitative interviewing
Good qualitative interviewing requires strong interviewing skills. This reading provides very practical advice for interviewers.
18. Michael Quinn Patton (2002). "Qualitative Interviewing", Chapter 7 in Qualitative Research and Evaluation Methods, 3rd Edition. Sage. (88pp)
Running a focus group
This book provides practical tips on running focus groups.
19. Rosaline Barbour (2007). Doing Focus Groups. Sage. (170pp)
Surveys
Surveys and questionnaires are often a very cost-effective evaluation tool. This reading provides an overview of the survey process.
20. Earl Babbie (2007). "Survey Research", Chapter 9 in The Practice of Social Research, 11th Edition. Thomson Wadsworth. (40pp)
Composing questionnaire questions
People designing surveys often (rightly) worry about whether the questions are 'good questions'. This reading provides many examples of 'wrong' and 'right' questions.
21. P. Alreck & R. Settle (1995). "Composing questions", Chapter 4 in The Survey Research Handbook, Second Edition. Irwin McGraw-Hill. (26pp)
Cognitive interviewing
A very useful way of testing the appropriateness of questions in a questionnaire is to do cognitive testing: ask some people to complete the questionnaire in a face-to-face interview where they think aloud and tell you their responses to the questions as they attempt to answer them. This reading provides practical step-by-step instructions on how to do this.
22. Gordon B. Willis (2005). "Cognitive Interviewing in Practice", Chapter 4 in Cognitive Interviewing: A Tool for Improving Questionnaire Design. Sage. (20pp)
Gathering data, making measurements
There are many different kinds of data one can gather in an evaluation process. This reading provides an introduction to data and things to consider when thinking about data as indicators.
23. Peter R. Scholtes (1998). "Keeping Track: Measurement of improvement, progress and success", Chapter 7 in The Leader's Handbook: A Guide to Inspiring Your People and Managing the Daily Workflow. McGraw-Hill. (30pp)
Results based accountability
Results based accountability (RBA) is a systematic way of thinking things through in order to take action to improve programs, agencies and service systems, so as to make people better off and/or improve the quality of life in communities, cities, states and nations. It is a useful tool for thinking about evaluation, improvement and data collection.
24. Mark Friedman (2005). Chapter 1 "What is Results Accountability and How Does It Work?" and Chapter 2 "The Building Blocks of Results Accountability" in Trying Hard Is Not Good Enough. Trafford. (28pp)
Evaluating advocacy
Advocacy is not a typical human service. Here are some readings and tools specifically about evaluating advocacy.
25. Louisa Gosling with Mike Edwards (2006). Chapter 11 "Planning, monitoring and evaluating advocacy" and Tool 13 "Frameworks to help analyse the advocacy process" in Toolkits: A Practical Guide to Planning, Monitoring, Evaluation and Impact Assessment. Save the Children. (26pp)
Collecting life histories
In human services it is often important to understand a person's life story as the context for service delivery and its impact.
26. Robert L. Miller (2000). "Collecting Life Histories", Chapter 4 in Researching Life Stories and Family Histories. Sage. (39pp)
Quantitative analysis
When analysing quantitative data it is important to understand the different kinds of questions that can be asked and what kinds of analysis are required to answer them.
27. Paul Bullen, Quantitative analysis: Questions to ask
Deciphering data and reporting results
How do you analyse the data you have gathered and report on it?
28. Michael Quinn Patton (2008). "The Meanings and Reporting of Evaluation Findings", Chapter 13 in Utilization-Focused Evaluation, 4th Edition. Sage. (50pp)