  • 00:04

    [RESEARCH METHODS tutorial] [How to Evaluate Interventions in Health Promotion]

  • 00:14

    LOUISE WARWICK-BOOTH: Hi, my name is Louise Warwick-Booth. [Louise Warwick-Booth, Reader, The Centre for Health Promotion Research, Leeds Beckett University] And I work at Leeds Beckett University in the Centre for Health Promotion Research. And part of the research that I do uses evaluation methods to assess interventions. So I'm going to talk to you today in this film about a step

  • 00:35

    LOUISE WARWICK-BOOTH [continued]: by step guide to doing evaluation in a health promotion context. Evaluation work is looking at interventions and assessing them, so we can assess whether an intervention is effective in terms of its final outputs and in terms of how it's working. Or perhaps we might want to look at its process: how does it actually work to get to the end result as well?

  • 00:59

    LOUISE WARWICK-BOOTH [continued]: And evaluation is incredibly important within health promotion contexts, because we want to know: is this intervention helping people? Does it improve their health? And what can we learn about this intervention from evaluation research? So can we then pass on that knowledge and that learning

  • 01:20

    LOUISE WARWICK-BOOTH [continued]: to other communities? And how is it working for the people who are receiving that intervention as well? Is it cost-effective as well? It's quite an interesting question that gets asked increasingly in evaluation research. [Health promotion evaluation - why evaluate?]

  • 01:43

    LOUISE WARWICK-BOOTH [continued]: We also evaluate interventions because we want to add to the evidence base in health promotion itself. So we learn as we know more about what works for people and what works with, for example, specific communities. So in conducting evaluations, we are gathering evidence about what is important in health promotion work.

  • 02:03

    LOUISE WARWICK-BOOTH [continued]: And that's important not just for communities in receipt of these interventions; it's also important in terms of policy making. So how do people know what is a good intervention to fund? How do they know that they're going to get value for money, that something is likely to work? How do you make those decisions? Well, you ideally make those decisions based on evidence,

  • 02:24

    LOUISE WARWICK-BOOTH [continued]: and that evidence comes from evaluation work in health promotion. There are different types of evaluation approaches that we might use to look at interventions. So commonly we talk about outcome evaluation. So what is the outcome from this intervention? So an evaluation will just look at those outcomes.

  • 02:45

    LOUISE WARWICK-BOOTH [continued]: So for example, is this intervention helping to improve health? And we might measure that in a number of ways. For example, are people less sick? We can take actual quantifiable measures and say, are people improving in terms of lower blood pressure? Are people weighing less as a result of this intervention,

  • 03:07

    LOUISE WARWICK-BOOTH [continued]: if that is one of its goals? So we can quantify and measure different health impacts in terms of the kind of end result of any intervention. OK, so that's one particular style of evaluation approach. We might also, for example, want to look at process. So not just the end result, but how is something actually working?

  • 03:29

    LOUISE WARWICK-BOOTH [continued]: Yeah, how does it work in practice, and what lessons can we gain from understanding that process? So here an evaluation will focus perhaps more longitudinally. It will look at an intervention at the start and trace that through to look at: how is it working? How is it being delivered? How has it been managed? Who's involved? What are workers' experiences as well?

  • 03:52

    LOUISE WARWICK-BOOTH [continued]: So there is an evaluation framework that we can use when we are working to collect evaluation evidence within health promotion, OK? So this is drawn from Green and South, who wrote a book called Evaluation in 2006. So step 1, clarifying aims and objectives.

  • 04:12

    LOUISE WARWICK-BOOTH [continued]: Step 2, choosing indicators. Step 3, linking outcomes, indicators, and methods. Step 4, understanding context and process. Step 5, setting up data collection systems. And then finally, step 6, bringing it all together.

  • 04:36

    LOUISE WARWICK-BOOTH [continued]: So step 1, clarification. When will the evaluation take place, OK? So are you looking at an intervention towards the end? Is that summative? So you're going to just look at what's happened, and you're going to do that looking backwards. Yeah. Or are you going to have more time

  • 04:56

    LOUISE WARWICK-BOOTH [continued]: to be involved as an evaluator? So are you going to be able to be involved at the start of an intervention, and then to follow the development of that intervention as it's applied in practice, and to evaluate the whole way through? Because then, if you are going to work in that way, then you will have more opportunity to, for example, feed into learning as the intervention is being developed

  • 05:18

    LOUISE WARWICK-BOOTH [continued]: and implemented, OK? You'll also then need to think about, for clarification purposes, what it is that you are looking at in terms of measures. Yeah, and are you looking at process as well, or are you just looking at outcomes? And that will depend upon the scale of the evaluation and clearly what you are being commissioned

  • 05:38

    LOUISE WARWICK-BOOTH [continued]: to do a lot of the time. You will also need to clarify the type of evaluation approach that you are going to use. So you might be using a test and learn approach, for example. So are you going to help the intervention test something and then provide evidence about the learning from that? Or perhaps you might use a realistic evaluation

  • 05:58

    LOUISE WARWICK-BOOTH [continued]: approach. So realistic evaluation comes from Pawson and Tilley's work, and it often involves using a theory of change approach. So you model a theory of change in terms of the context of the intervention, the mechanism, the way in which it's going to be delivered, and then you look at what that leads to in terms of an outcome.

  • 06:21

    LOUISE WARWICK-BOOTH [continued]: [Theory of Change] If we use a theory of change approach, we are going to improve the rigor of what we do from a methodological point of view. So we're going to really look at context. We're going to look at mechanism, and then we are going to look at outcomes.
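The context-mechanism-outcome structure described above can be sketched as a simple record. This is a minimal illustration only, assuming a plain Python representation; the class and field names are the author's framing, not part of any evaluation toolkit.

```python
# A minimal sketch of a context-mechanism-outcome record for a theory of
# change, loosely modelled on the vulnerable-populations example in the talk.
# The field values are illustrative summaries, not real evaluation data.
from dataclasses import dataclass, field


@dataclass
class TheoryOfChange:
    context: str                      # the setting the intervention operates in
    mechanism: str                    # how the intervention is expected to work
    outcomes: list = field(default_factory=list)  # what it should lead to


toc = TheoryOfChange(
    context="four vulnerable populations with poor knowledge of services",
    mechanism="engagement via dedicated specialist project workers",
    outcomes=[
        "improved knowledge of services",
        "more appropriate service use",
        "improved health",
    ],
)
print(toc.mechanism, "->", toc.outcomes)
```

Writing the model down like this at the start of an evaluation gives you something explicit to test against as data collection proceeds.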

  • 06:41

    LOUISE WARWICK-BOOTH [continued]: And the way that we can use this approach is that we can develop a theory of change at the start of an evaluation. And then we can test that theory of change as we work through data collection. So how do you create a theory of change? Well, one way, if you are working in quite a co-productive way, is you can involve the people who

  • 07:01

    LOUISE WARWICK-BOOTH [continued]: are delivering that intervention, who are responsible for that intervention. You can involve stakeholders, and partners, and external organizations, for example. And you can have conversations with those, either individuals or groups, and then you can ask them about their views about how this is likely to work and what's important from their point of view.

  • 07:23

    LOUISE WARWICK-BOOTH [continued]: And then you can include that in your theory of change, in terms of thinking about what you're going to measure and how you are going to gather evidence. I'm going to talk you through an example of a theory of change that I developed and used as part of a health promotion intervention. This theory of change is related to an evaluation of a vulnerable populations intervention.

  • 07:46

    LOUISE WARWICK-BOOTH [continued]: So this is an intervention that is focusing upon four different vulnerable populations, OK? And these four different vulnerable populations are Travellers and Gypsies, the homeless community, sex workers, and ex-offenders who have recently

  • 08:08

    LOUISE WARWICK-BOOTH [continued]: been released from prison. And so what we're looking at here, in terms of the aim of the intervention, is an overarching intervention that's dealing with all four of these different communities. And the overall aim for each of those communities is to support clients to have improved knowledge of services,

  • 08:29

    LOUISE WARWICK-BOOTH [continued]: and to use those services appropriately as well. So if you need support, are you going to the right service? Because that is a much more cost-effective approach than, for example, consistently accessing emergency service care. So what is the mechanism for change for these vulnerable populations? The mechanism for change here is engagement.

  • 08:51

    LOUISE WARWICK-BOOTH [continued]: It's support from a specialist project that has an understanding of working with that vulnerable community. And they're going to provide dedicated staff time and a worker to these individuals, OK? And so how do we think that this intervention is going to work for those who are accessing it?

  • 09:13

    LOUISE WARWICK-BOOTH [continued]: So this is about changing the environment for these individual groups of people. So through that, and through interactions with their project worker, through building positive relationships and developing trust with an individual that they can go to for help, these people in receipt of support are then able to access appropriate services

  • 09:36

    LOUISE WARWICK-BOOTH [continued]: because they're referred along the right lines, rather than being uncertain of which service to access. Where if you need housing support, you go to this area. If you need health care, we work in this way. And this is the service that you can access there. And then the intervention is obviously aiming to improve the health of each

  • 09:57

    LOUISE WARWICK-BOOTH [continued]: of these vulnerable populations as well. How do we know if this intervention is going to work? Well, we're going to look at outcomes for those service users, OK? So those outcomes: are they better engaging with services in a more appropriate manner? And is their health improving as well?

  • 10:18

    LOUISE WARWICK-BOOTH [continued]: Do they know where to go to get support? So is their knowledge improved? That's quite important, to make sure that they are going to the right places as well. So that's outcomes for service users. What about outcomes at an organizational level, for the specialist organizations involved in delivering this intervention? What might they be?

  • 10:39

    LOUISE WARWICK-BOOTH [continued]: Do they actually gain knowledge about how to better work with community members? What works for some community members doesn't work for others. And it's important to recognize that when we are supporting people from vulnerable communities. So we might need different styles of work. For example, do we need to work in a way that involves outreach,

  • 10:59

    LOUISE WARWICK-BOOTH [continued]: yeah, for some communities? Do the communities need to have support provided to them in a different way? So information giving might work well for some communities in a written form. For others, it may be that they need that communicating verbally. So what's the learning there for organizations as well? Step 2 in evaluating health promotion interventions

  • 11:22

    LOUISE WARWICK-BOOTH [continued]: is how to choose indicators. So how are we going to measure something? What are we going to measure? And are we going to measure that in a quantitative way? So for example, are we going to use tools like a questionnaire, or are we going to take independent health measures? So we're going to look at people's weight, blood pressure, things like that.

  • 11:42

    LOUISE WARWICK-BOOTH [continued]: Or are we going to measure things in a qualitative way, in terms of speaking to people, asking them their opinion? And that could be, for example, service users. It could also be staff members as well who are involved in the delivery of that intervention. So for example, if we want to look at an intervention working

  • 12:03

    LOUISE WARWICK-BOOTH [continued]: with young people, to know how it's worked for those young people, we can qualitatively speak to those young people and say, has being involved helped you to improve your self-confidence, for example? And they can rate their self-confidence improvements as part of that evaluation data set.
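A self-rated indicator like the confidence example above can be summarised very simply once collected. This is a hedged sketch assuming a 1-5 rating scale; the ratings below are invented for illustration, not real evaluation data.

```python
# Summarising invented self-rated confidence-improvement scores (1 = no
# improvement, 5 = large improvement) from a hypothetical young people's
# intervention, as a simple indicator for the evaluation data set.
ratings = [4, 5, 3, 4, 2, 5, 4]

# Tally how many participants gave each rating.
counts = {}
for r in ratings:
    counts[r] = counts.get(r, 0) + 1

# Proportion reporting a clear improvement (rating of 4 or 5).
share_improved = sum(1 for r in ratings if r >= 4) / len(ratings)
print(counts, f"{share_improved:.0%} rated 4 or 5")
```

Even a crude summary like this turns qualitative self-report into something that can be compared across groups or time points.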

  • 12:24

    LOUISE WARWICK-BOOTH [continued]: Now we're going to move on to exploring step 3. This step is thinking about how you're going to link the outcomes, indicators, and methods in your evaluation. So you need to think about what it is that you are going to measure in terms of an indicator. So if you are looking at something like, has this intervention enabled someone

  • 12:46

    LOUISE WARWICK-BOOTH [continued]: to improve their sleep, then how do you then measure that? And you could perhaps ask people to self-report on that in a qualitative way. Or you could, for example, try and quantify that and get people to keep a record of the hours that they slept, and to map that across a particular time period.
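The sleep-diary idea above is easy to quantify once the hours are logged. A minimal sketch, assuming nightly hours recorded for a baseline week and a follow-up week; the numbers are invented for illustration.

```python
# Quantifying change in a hypothetical sleep indicator: hours logged nightly
# for one week before the intervention and one week after. The figures are
# made up to illustrate the calculation only.
baseline_hours = [5.0, 5.5, 6.0, 5.0, 4.5, 6.0, 5.5]
followup_hours = [6.5, 7.0, 6.0, 7.5, 6.5, 7.0, 6.5]


def mean(xs):
    return sum(xs) / len(xs)


# Positive change means more sleep per night at follow-up than at baseline.
change = mean(followup_hours) - mean(baseline_hours)
print(f"mean nightly change: {change:+.2f} hours")
```

Linking the indicator (hours slept) to the method (a diary kept over a set period) is exactly the step 3 exercise the framework describes.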

  • 13:10

    LOUISE WARWICK-BOOTH [continued]: So here, you are particularly interested in measuring how change has occurred. Yeah. You can measure change in a number of different ways, but you perhaps want to know how that change has occurred, and how you are going to detail that within your evaluation, and link that to the methods that you are going to use. So are you going to detail that change through, for example,

  • 13:33

    LOUISE WARWICK-BOOTH [continued]: interviews? Or are you going to report that change through using things like survey data? So let's now turn to step 4. This is about understanding context. So each intervention that we evaluate is often located within a different context. Context here can mean geographical location.

  • 13:55

    LOUISE WARWICK-BOOTH [continued]: It can mean a different organizational context as well. So you need to understand the context in order to really understand process, and to work out what's happening within this intervention. So you can understand context in a number of different ways. And that may involve working with the people who

  • 14:16

    LOUISE WARWICK-BOOTH [continued]: are delivering that intervention in quite a co-productive way. So they may be quite involved in developing the evaluation tools. Or perhaps, depending upon their experience of research, they may be more distant from that, and therefore you may offer a more traditional approach to an evaluation, where you remain distant as a researcher

  • 14:38

    LOUISE WARWICK-BOOTH [continued]: and you just collect data from those individuals, rather than them helping you to co-produce and design what you are doing. When we're thinking about our data collection methods, we need to have a plan for who needs to be involved. Who are we going to use those methods with? So are we going to use those methods with service users?

  • 14:58

    LOUISE WARWICK-BOOTH [continued]: Are we going to use those methods with staff? Are we also going to use those methods with people beyond the boundaries of that intervention? So for example, we might want to gather data from people who have made referrals into that project. We might want to gather data from family members of people involved in that intervention. So do we want to talk to carers and get their perspective?

  • 15:20

    LOUISE WARWICK-BOOTH [continued]: So we have to think about who we need to speak to. And we also need to think about the purposes of the data collection tools that we are using there. So when are we going to use those as well, and why? So are we going to use those at the end of an intervention? Are we going to try and use different methods throughout?

  • 15:40

    LOUISE WARWICK-BOOTH [continued]: And if you are involved in an evaluation throughout, then you have an advantage, because you can track service users throughout an intervention. And so you can use those tools at the beginning, and then you could use them later on to quantify a change. And you might have access to more service users as well. For example, as they work through that intervention,

  • 16:03

    LOUISE WARWICK-BOOTH [continued]: if it's been delivered for a longer period of time. If you are coming in to do an evaluation at the very end, then you may have less access to people, and you will be more restricted in terms of using those methods at the end of that intervention. Let me take you through an example of data collection systems used within the vulnerable populations evaluation.

  • 16:24

    LOUISE WARWICK-BOOTH [continued]: So the example of the theory of change that I talked to you about already. So what data collection did I use in this evaluation? A number of different types of data collection. So monitoring data, OK? First of all, the people who are from each vulnerable community. How often are they attending appointments?

  • 16:46

    LOUISE WARWICK-BOOTH [continued]: So how much service support are they being given? So that's what we call monitoring data. That's often collected by workers who are involved as a project worker delivering an intervention. So they keep records of, perhaps, the number of times they've spoken to someone on a phone, the number of times that they have dealt with a person physically in the same room, the kind of support they've given,

  • 17:08

    LOUISE WARWICK-BOOTH [continued]: the type of referrals that they might be giving to those individuals. So why are they referring them on? So we gather that monitoring data, so we have an understanding of the type of support, the amount of support, and the way in which it's been given, OK? So that's one component of that data collection. Second, we also need to gather data on service users.

  • 17:30

    LOUISE WARWICK-BOOTH [continued]: So we did that here. We gathered data on service users from the perspective of staff, OK? So staff produced case studies about service users to document their progress from the support that they had received. So case studies were populated into a template that I created as part of an evaluation tool

  • 17:51

    LOUISE WARWICK-BOOTH [continued]: in this particular instance. So staff then complete that, and then that can be analyzed and included in the data set. What about service user experience and staff learning? Both of those we can gather through using qualitative interviews. So in this particular project, there were interviews with both staff and with service users as well.

  • 18:14

    LOUISE WARWICK-BOOTH [continued]: Finally, a little bit more quantifiable data collection with those vulnerable population members, with those service users. How do we know that their health is improving, OK? Well, we can try and assess that through using questionnaire tools. So in this instance, we used two different questionnaires, because we weren't sure what was going

  • 18:36

    LOUISE WARWICK-BOOTH [continued]: to work with these different populations. So actually here, we are also experimenting with methods to try and find out what suits these different communities the most. So we used a health measure called the EQ-5D-5L, which is a health questionnaire. Any of you who perhaps have had a hospital appointment or a GP appointment may well have completed this questionnaire.

  • 18:58

    LOUISE WARWICK-BOOTH [continued]: It's used frequently in health care settings, and we used that at baseline. So at the first appointment that someone had, and then later on at a follow-up, to see if there'd been a change, OK? And the second tool we used is called the Chaos Index, which was a tool suggested to us by one of the service

  • 19:19

    LOUISE WARWICK-BOOTH [continued]: providers, because they particularly felt that that would work with a vulnerable population. So this is again a questionnaire that's already validated, and we used that across all of the population groups, so the four client groups. Again, at the first interaction with the project support worker, so baseline, and then at a later follow-up period six months later
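The baseline and six-month follow-up design described above boils down to a paired comparison per client. A minimal sketch, assuming higher scores mean better health; the client IDs and score values are invented and are not the actual evaluation data set.

```python
# Summarising hypothetical baseline vs six-month follow-up questionnaire
# scores per client, to quantify change. Higher = better health here;
# all IDs and values are invented for illustration.
baseline = {"c1": 0.55, "c2": 0.40, "c3": 0.70, "c4": 0.45}
followup = {"c1": 0.70, "c2": 0.60, "c3": 0.68, "c4": 0.65}

# Paired change score for each client seen at both time points.
changes = {cid: followup[cid] - baseline[cid] for cid in baseline}

improved = sum(1 for d in changes.values() if d > 0)
mean_change = sum(changes.values()) / len(changes)
print(f"{improved}/{len(changes)} improved, mean change {mean_change:+.3f}")
```

In practice, attrition between baseline and follow-up means the paired set is often smaller than the baseline set, which is one reason being involved throughout the intervention helps.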

  • 19:39

    LOUISE WARWICK-BOOTH [continued]: to see if we can quantify change in terms of health improvement for those community members. We're now moving on to the final step, step 6. How do we bring it all together? So what do you do with all this information and evidence that you have, and where is it going to go? So as a standard, one of the things

  • 20:00

    LOUISE WARWICK-BOOTH [continued]: that we often do as evaluators is we produce a final written report. And then that report will go to the people delivering the intervention. It will often be shared, for example, with service users. It might be shared amongst partners and external organizations. And most certainly, the people who funded any intervention

  • 20:20

    LOUISE WARWICK-BOOTH [continued]: will want to have a look at that report as well. They'll want to know what evidence you have of change, and what learning there is as well. Funders vary in terms of their expectations around these reports. And some funders are very, very supportive and are really keen and interested to know what learning can be gathered from the evaluation data,

  • 20:41

    LOUISE WARWICK-BOOTH [continued]: because then they can take that forward into future interventions as well. And we may also use material beyond reports. The report is often the first stage of a dissemination process. So we might have events where we have conferences. We involve people locally in talking about those

  • 21:03

    LOUISE WARWICK-BOOTH [continued]: and the evaluation results. And then, of course, as academics, we often publish in journals and at conferences as well. Depending upon the methods that you've used, you may have material beyond just the report. So if you used some more creative methods in your evaluation work, then you can use those as well in the dissemination

  • 21:24

    LOUISE WARWICK-BOOTH [continued]: process. So when you are bringing it all together, there may be issues, because you may have evaluation evidence that shows that an intervention does not work. So if something doesn't work, how is that then going to be received? How is the funder going to respond? What about the people who have been involved in delivering that intervention as well?

  • 21:46

    LOUISE WARWICK-BOOTH [continued]: You may also experience issues in terms of how your data is received in your final evaluation report when you have found that something partially works, but perhaps doesn't fully, or that there are issues within the intervention. So there's a political element to this. Political with a small p.

  • 22:07

    LOUISE WARWICK-BOOTH [continued]: Producing a report, for example, may not be the end in terms of conversations that you are having with the people you have been working with in that evaluation. [Conclusion] You've been given a kind of stepped plan to think about how to use evaluations

  • 22:28

    LOUISE WARWICK-BOOTH [continued]: within health promotion. Evaluation is complicated. It takes thought, and theory of change is a really useful model for helping you to make sense of any intervention and then to link it all together. But yeah, it's challenging territory, and it can be political. So it's really, really interesting.

  • 22:49

    LOUISE WARWICK-BOOTH [continued]: Thank you for watching. And do take a look at the suggested reading so that you can learn more.


Louise Warwick-Booth, Reader, Centre for Health Promotion Research at Leeds Beckett University, presents a theory of change approach and step-by-step guide to doing evaluations in a health promotion context, including reasons to evaluate, clarification of aims, indicators, linking outcome and methods, understanding context and process, data collection, and bringing it all together.
