STAKEHOLDER PARTICIPATION GUIDE 20
Participation: finding out
what difference it makes
The Social Care Institute for Excellence (SCIE) was
established by Government in 2001 to improve
social care services for adults and children in the
United Kingdom. We achieve this by identifying
good practice and helping to embed it in everyday
social care provision.
SCIE works to:
• disseminate knowledge-based good practice
guidance
• involve service users, carers, practitioners,
providers and policy makers in advancing and
promoting good practice in social care
• enhance the skills and professionalism of social
care workers through our tailored, targeted and
user-friendly resources.
STAKEHOLDER PARTICIPATION RESOURCE GUIDE 07
Participation: finding out
what difference it makes
First edition published in Great Britain online at www.scie.org.uk in June 2007,
this edition published September 2007 by the Social Care Institute for Excellence
© Social Care Institute for Excellence
All rights reserved
978-1-90481240-1
Written by Mark Doel, Chris Carroll, Eleni Chambers, Jo Cooke, Anne Hollows,
Linda Laurie, Linda Maskrey and Susan Nancarrow
This report is available in print and online
www.scie.org.uk
Social Care Institute for Excellence
Goldings House
2 Hay’s Lane
London SE1 2HB
tel 020 7089 6840
fax 020 7089 6841
textphone 020 7089 6893
www.scie.org.uk
Please contact SCIE if you would like this resource guide
to be made available in a different format.
CONTENTS

Summary

Section 1: About this guide
About the research
What is evidence of success?
Terms used in the guide

Section 2: Nine big questions
1 Why bother to evaluate?
2 What stops us from finding out whether participation makes a difference?
3 What do we mean by making a difference?
4 When do we decide to find out whether a difference is being made?
5 Who says?
6 How do we find out?
7 What tools and resources do we need?
8 What about differences?
9 What happens next?

Section 3: Checklist of pointers
Numbered references
Toolkits and further references
Other references
Additional references: difference and diversity
Practice sites
Acknowledgements
With especial acknowledgements for the help, advice and support of the advisory
group of service users and carers, who included Christine Barton, Emma Davie,
Rachelle Heslop, Luminous Makumbe, Alan Meadows, Ian Porritt and Marjorie Quine,
and to Sheila Wallage for her administrative support. Our appreciation, also, of the
time and patience of the practice sites whose experiences have contributed to this
Resource guide; and to Pete Fleischmann, Principal Advisor Participation, Social Care
Institute for Excellence, for his valued support.
The project team
Mark Doel (Project Lead, Sheffield Hallam University)
Chris Carroll (ScHARR, University of Sheffield)
Eleni Chambers (Service User Researcher)
Jo Cooke (Trent RDSU)
Anne Hollows (Sheffield Hallam University)
Linda Laurie (Service User Researcher, ‘New Perspectives’)
Nicola Maskrey (Service User Reviewer)
Susan Nancarrow (Sheffield Hallam University)
Summary
Introduction
This guide is based on research commissioned by the Social Care Institute for
Excellence to develop measures that can be used to help evaluate the impact of
service user and carer participation.
With service user and carer participation firmly on the agenda, there is a need to find
out what difference that participation is making. No matter how
right participation is, we also need to know how we can measure the differences that
it is making. Whilst the service user participation movement has achieved much in
terms of the principle, it is less clear what changes have resulted in practice.
Purpose
• To find out in what ways service user and carer participation is being evaluated.
• To suggest ways of finding out what difference service user and carer participation
is making to social care services.
Methodology
The research that informs the guide focuses on three main areas:
• What does the available literature tell us about how participation is being
evaluated?
• What can we learn from examples of service user and carer participation (the
'practice sites')?
• Are there 'toolkits' that can help individuals and organisations to find out what
impact participation is having?
The literature review built on the work of SCIE Position Paper 3 (2004), Has
service user participation made a difference to social care services?, so electronic
and manual searches were conducted for reviews published from 2001 onwards. Studies were
excluded if they simply reviewed service user involvement without any evaluation
of the participation. Thirty key reviews met the criteria for inclusion and these were
analysed using a standard pro-forma developed by the team of academic and service
user researchers.
In order to access the ‘grey literature’ and to identify ten practice sites as examples
of evaluation of participation, 1,599 social care organisations were contacted across
England, Wales and Northern Ireland in the summer of 2006. Thirty responses were
received and from these and other ‘snowballing’ techniques, ten practice sites were
selected. These criteria were used to help the selection: geographical spread; client
group spread; recency of evaluation; variety of evaluation methods being used.
From the literature review, the practice survey and informal methods, twelve toolkits
were studied in more detail.
An Advisory Group of service users and carers was facilitated by a service user
researcher and gave advice to the research group about specific elements of the
research.
Findings
The research pointed very clearly to a gap between participation of service users and
carers (considerable activity) and systematic evaluation of what difference this is
making (relatively little). This gap can be seen both in the literature and in practice,
and is probably one of the reasons for the very low return rate from the practice
survey.
In part, this gap between participation levels and evaluative activity can be explained
by the barriers. If we understand these barriers, we can begin to overcome them in
order to make evaluation an essential part of participation. The main themes are
listed here.
• Power differences between professionals and service users can make honest
evaluations difficult to achieve; power is also important in terms of who sets the
stage for the evaluation (who decides what will be evaluated and how?).
• Expectations about what will be evaluated might be unclear; for example, is it the
process of participation or the outcomes or, more likely, both? Intrinsic benefits
of participation (the value of participation in itself) are linked to, but also separate
from, extrinsic benefits (the results of participation).
• Evaluation needs to be built in from the beginning, along with the resources to
conduct it. Although project funding might include evaluation, often it does not
include evaluation of participation.
• Participation is like breathing for many organisations that are led by service users
and carers, so it can be hard to know what aspects to evaluate.
• It is difficult to know whether 'A' caused 'B'; in other words, did this participation
here cause that difference there?
• Commitment to the principle of participation can make it difficult to be objective
about the difference it is making or, indeed, whether it is making any difference at
all.
• Effective evaluation might require training (e.g. for service users to become
research interviewers) or support (e.g. to ensure that the experience of evaluation
is constructive and not hurtful).
• The culture in an organisation, including staff attitudes, can be hostile to
evaluation. There may be fears, real or not, about what evaluation of participation
will discover.
• It can be difficult to include people who are seldom heard in the evaluation.
• Tokenism occurs when an organisation feels satisfied that it has ticked the boxes,
yet the reality is experienced very differently by service users and carers.
• There are different timescales for service users, carers, workers, managers and
researchers. One reason for the low response rate to the survey in this research
was probably the fact that the timescale was too tight for most service user-led
organisations that would want to consult with all their members.
There are different kinds of evaluations for different kinds of purpose. Some of
the evaluations included in the practice survey were on-going and these fell into
two categories: those that had a continuing commitment to evaluation as part of
a ‘quality loop’, and those that had moved into a new phase of evaluation, building
on the learning from the first phase. Some of the sites involved the evaluation of a
specific, time-limited project or one project moving into another, in which evaluation
had been built into the terms of the project and, importantly, its budget.
Making use of the findings
It is not possible to declare which methods of evaluation are best for what kinds of
participation. However, ‘nine big questions’ did emerge from the research findings,
along with a list of twenty pointers. If individuals and organisations ask themselves
these questions and address these pointers, they will be helped to develop the most
fitting approach to evaluating the difference that participation is, or is not, making.
The ‘nine big questions’ are outlined in detail in Section 2 of this guide. The twenty
pointers are listed in Section 3. With responses to these questions, individuals, groups
and organisations will be better equipped to develop measures to evaluate the
effectiveness of service user and carer participation.
Section 1
About this guide
Who is the guide for?
This guide is written for any individual, group or organisation wishing to find out
whether service user and carer participation is making a difference.
Finding your way around the guide
Section 1
The first section tells you about the purpose of the guide. The research that informs
the guide is briefly described. We consider the nature of evidence; in other words,
what might show that participation has made a difference. We look at the idea of
success and how this is not as simple as it seems. Any words in blue are listed in the
fourth part of Section 1, with further explanation.
Section 2
We suggest that there are ‘nine big questions’ that have to be answered, or at least
asked, when you are finding out whether service user and carer participation has
made a difference. Each ‘big question’ is discussed in turn, with a summary and two
boxes. The Findings box lists what the research suggests can help begin to answer
the ‘big question’. The language in the Findings boxes is, therefore, sometimes more
formal than in the rest of the guide. The Ideas box aims to make the process of
answering the big question more creative and involving.
Section 3
There is a checklist of twenty pointers to help you find out whether participation is
making a difference. Also in this section is a list of research reviews and toolkits that
you might like to refer to, as well as a description of the practice sites.
Participation and evaluation
There are many ways of being involved in social care:
• as someone who uses services
• as someone who cares for a service user
• as a worker
• as a manager
• as a researcher
• as a policy maker.
Social care might be provided by service users and carers, by volunteers or workers,
or a combination of these. The services might be provided in the state, charitable or
private sectors. Social care is a big part of our lives and, like health and education, it
is something that everybody is likely to have some contact with during their lives.
In recent years the people who use services and their carers have had a greater say.
This service user and carer participation is meant to improve services by listening to
what people want and by acting on this information. In some cases, service users are
now organising their own services too. Most people think that participation is a good
thing, but:
• how do we know whether participation makes a real difference?
• what are the costs and what are the benefits?
• what are the best ways to find out whether participation is making a difference?
• what can we learn from the way that other people have done this?
This guide has been written to help answer these questions. It is based on research to
discover what is already known about the evaluation of participation.
About the research
Purpose of the research
We wanted to find out what is already known about the difference that service user
and carer participation can make and how people have gone about investigating this.
We were especially interested in how the difference was being measured.
When you ask a question such as ‘how do you find out whether participation is
making a difference?’ you find yourself asking even more questions rather than
providing one short answer. So, the guide is not so much about answers but
about making better sense of the questions. To do this, we will present ‘nine big
questions’ that the research suggests are important to ask when finding out whether
participation has made a difference.
The research methods
In order to find out what is already known about this topic, we:
• searched for reviews of this topic (the work that other researchers have done to
compile knowledge about evaluation of participation)
• read and digested 30 reviews in total
• reviewed twelve practice guides to evaluating participation, which we call toolkits
• sent a short questionnaire to 1,599 different social care organisations in England,
Wales and Northern Ireland (we had 30 replies)
• issued a press release about the research (we received 12 responses)
• interviewed key people at ten practice sites where they had done or were doing
evaluations of participation
• were steered by an Advisory group of eight service users and carers, facilitated by
a service user–researcher.
Understanding the research findings
We discovered that much more has been written about how people participate
than about how we find out what difference participation makes. Even so, some of
the methods used to help people participate can also be used to help find out what
difference it has made. We have collected all of this information into the ‘nine big
questions’, which you will find in Section 2 of this guide. We hope this will help both
to understand the findings and to make practical use of them.
Crawford et al (2002, R08) point out that the ultimate goal of service user
participation should be the promotion of health, quality of life, or overall user
satisfaction with services. However, these outcomes are often difficult to measure,
they can take a substantial amount of time to become evident, and the link with
the participation of service users and carers can be difficult to prove. These can be
barriers to evaluation (see big question 2). As a result, evaluations tend to use short
term indicators.
Crawford (2003: p79 R01) developed four categories of outcome, where these have
been evaluated:
• increased satisfaction with services
• promotion of further user involvement initiatives
• improved management
• changes to service priorities.
Evaluations can be usefully considered as focusing on ‘voice’, ‘choice’, and ‘change’
(R09). Each of these constructs lends itself directly to an evaluative question: ‘did
they listen?’, ‘did I get what I wanted?’ or ‘did the service change?’ Our research
suggested two kinds of benefit from participation – intrinsic and extrinsic. Intrinsic
benefits come from the process of participation itself, such as improvements to
self-esteem and changes in attitude. How much these intrinsic benefits are valued
varies, with Truman (2005: p572 R14) suggesting that ‘user involvement should
not be seen as an end in itself but rather it is a means of enabling people to make
choices and have control over their daily lives’. So the intrinsic value of service user
participation might also have an impact on extrinsic changes; for example, increased
self-confidence gained via the process of participation (intrinsic gains) might be
necessary before people have confidence to campaign for specific changes to
a service (extrinsic gains).
An example of an intrinsic benefit of participation is: ‘children have said that having
a say is more important than getting what they want’ (R07).
An example of an extrinsic benefit of participation is: ‘new services were developed
and costs of care were reduced’ (R09).
Open communication is a crucial part of the process of evaluation, but the power
to provide a service (and stop providing it) makes it difficult to have an equal
relationship between service users, carers and professionals. This is why the question
of who does the finding out (who evaluates) is so important (see big question 5).
Although there are many different methods that can be used to find out what
difference participation is making, we do not yet have enough evidence to know
which method is best for which situation (see big question 6). Networks, whether
service user and carer or professional, are important in providing strength and
support, but it is also necessary to reach out to people who are not part of a network
or group (‘seldom heard’ people).
What is evidence of success?
Different views of success
We are expected to look for evidence of what works well and to use this. This is called
evidence-based practice. ‘That’s how we’ve always done it’ is not good enough; we
should know what evidence there is to support what we are doing. However, there
are a lot of things we do not have much evidence of, and most situations are very
complicated, so the evidence is not simple. Taking the following everyday example
of a house extension, what might be the evidence that it was successful?
• The people living in the house might say it is a success if it gave them the extra
space they had hoped for.
• The person owning the house might think it is a success if it was built on time and
within the agreed price and it increased the house's value.
• The builder might judge it a success if it has made a profit.
• The architects might see success if their plans have been followed exactly.
• The local planning office might judge success if the extension meets all the
regulations and planning laws.
• Neighbours might sense success in an extension that doesn't shade their garden.
• People across the street might see success in an extension that is in keeping with
the neighbourhood.
• Family and friends of the people living in the house might feel success if it gives
them a comfortable spare room to stay in when they come to visit.
From this example we can see that evidence of success needs to take account of
many different views. Also, an opinion might change depending on when it is asked
for. During the building of the house extension you might feel positive because the
builders are involving you in making decisions. Or you may feel unhappy because the
builders’ work is very messy. Evidence of success is not just about what happens in
the end (outcome) but it is also about how we all got there (process).
The meaning of successful participation
The answer to the question ‘how do we know whether being involved has made
a difference?’ is not easy. It depends on:
• who we ask ... and who does the asking
• when we ask the question
• how we ask it
• what we ask about
• how we feel about being asked (what has our past experience been?)
• whether we think it will make a difference if we bother to answer
• whether we feel we can be honest
• the different power of the people involved
• what 'being involved' has been like in the past.
Terms used in the guide
Some of the words used in this guide are explained in more detail here.
Advisory group
The group of eight service users and carers who helped to advise the research and the
production of this guide.
barriers
What gets in the way of finding out what difference has been made by taking part
and joining in.
benchmark
Finding out exactly how things are now, so that you will be able to know whether
anything has changed later on.
communication
Communication is the way we understand what other people mean and let them
know what we mean; it needs to be open and honest.
evaluation
The term used for finding out how something is working, whether it has led to any
changes and what kinds of difference these changes have made.
evidence
Proof that things have changed; examples of the differences that participation has
made.
extrinsic
A benefit that arises from some identifiable end result or outcome. It is usually
specific and is something that is evident.
findings box
The main messages from the research are given in these boxes (for each of the big
questions in Section 2). Follow up the reference numbers in blue.
grey literature
This term is used to describe reports, toolkits and other documents that have not
been published but are very useful.
ideas box
Suggestions to help you evaluate the difference that participation makes (one box in
each big question).
indicators (milestones)
Signs to show that there is progress being made and that changes are beginning to
happen. These might be different for different people.
intrinsic
A benefit that arises from the process of participation itself. It may not be evident
except to the person who feels the benefit.
involvement
Another word for taking part, though involvement is not always seen as quite so
active as ‘participation’.
network
Networks are strong links between groups of people. They can link many different
groups of people together, which can give them more power by acting together and
sharing information.
outcome
The results of taking part. The outcome is the end result and is usually specific and
planned.
participation
The general word for getting actively involved, joining in and taking part.
power
Power can come from many sources, and it helps you to do what you want to do.
Differences in power (for example between service users and paid workers) can stop
participation from being successful if the differences are not changed.
practice sites
The organisations that were contacted as part of the research for more details about
the ways they were evaluating service user participation. References to the practice
sites are in blue, e.g. (P01), and listed in full in Section 3.
process
The outcome is the end result and the process is the way you get to the result.
Sometimes that journey can be as important as the arriving, so the experience of
participation is part of the evaluation.
research reviews
These reviews collect the findings from many other studies on a similar topic and
usually make comments on them. References to the reviews are in blue, e.g. (R01),
and listed in full in Section 3.
stages of participation
Adaptive and transformational (see T7 for further explanation).
toolkits
Toolkits are handy ‘how to’ guides with practical ideas about doing evaluations.
References to toolkits are in blue, e.g. (T1), and listed in full in Section 3.
Section 2: Nine big questions
Big question 1: Why bother to evaluate?
Are there good reasons for finding out whether and how participation
is making a difference?
Summary
The reasons for evaluating participation are likely to influence the way in which it is
conducted. Do you want to ‘prove’ that participation works or to describe what the
process was like? What would be the result of not finding out whether participation
has made a difference, and how will findings be used to make changes?
What is the likely balance of costs and benefits from finding out whether
participation has made a difference? It is important that services learn from the
experiences of those who use them, so that they can become more responsive.
Findings box 1
• Service user and carer participation might be an end in itself (R14) linked to
broader issues of citizens’ rights and democracy (R15).
• The question ‘why evaluate participation?’ is not the same as ‘why involve people
in the first place?’ (R16). The two questions are linked, especially where the main
aim has been service user involvement in research and evaluation (P09) (P10).
• Taking time to evaluate indicates the value of a service by gathering reliable and
valid information in a systematic way (R18).
• As well as the value to service users and carers, it is worth considering the
potential benefits to social care workers, such as feeling energised (P08).
• Evaluation is about communication – a dialogue in which people come to the
table to talk about ‘that which is of value, merit, worth or significance’ (T4,
p112).
Ideas box 1
How important might each of the following potential benefits be for your
organisation – and therefore, how might they help make the case that evaluating
service user and carer participation is worth the effort?
• Improved access to services
• Improvement in the quality and responsiveness of services
• Better informed planning and development
• Evidence of more accountability
• Energised staff experiencing more job satisfaction
• Increased opportunities for service users and carers to share both their frustrations and their appreciation
• Service users and carers feeling more valued and more confident
• Service users and carers feeling they can make a difference
• Improvement in the relationship between service users, carers and the wider community
• Funders have a better understanding of the service’s strengths and weaknesses and what to do about this
(Adapted from T2, p7)
Big question 2: What stops us from finding out whether participation
makes a difference?
What are the barriers to evaluation?
Summary
There is more information about what stops people from participating than what
stops people from evaluating (R15). The findings point to differences in power
between service users, carers, workers and organisations as a barrier to evaluation.
Consider which people are likely to have the most power in the house extension
example in Section 1 and how that might exclude some people from defining
‘success’.
Evaluation takes time, commitment, skills, resources and systematic planning, and
if any of these are not available it is likely to prevent the evaluation from happening
or from being successful. Evaluations need to be planned from the beginning, and
costed into any proposals.
Findings box 2
• Differences in power can be a barrier to honest evaluation (R19) (R20) and
evaluating means being prepared to accept findings that might change the power
balance and may be contrary to current policies (R17).
• Real or perceived fear of the costs of evaluating can be daunting, along with
concerns about additional costs that might be indicated by the findings (R17).
• Timescales may be different for different groups – service users, professionals,
organisations, researchers.
• The main focus of evaluation may be elsewhere, e.g. driven by the terms of the
project’s funding, which may not have specified evaluation of service user and
carer participation (P02) (P03) (P04).
• Poor motivation to get involved in evaluation, perhaps because of ill health or
past experience of it not making a difference (R02).
• Attitudes of staff may be hostile or unsupportive to evaluations (R05).
• The culture in the organisation is hostile or not supportive to evaluating
participation (R06).
• It is difficult to prove that this change is due to that participation (R08).
• Practical matters can prevent thorough evaluations, such as lack of transport in
rural areas (Branfield et al, 2006).
• Psychological issues can be a barrier, such as seeing participation as something
that you ought to do whatever the result, so why evaluate it? If participation is a
requirement or a right, what is the point of evaluating it? (R02) (R21).
• The tick-box mentality (if the box is ticked it feels like it has been done).
• An insufficiently clear plan for participation and evaluation means that there is
nothing tangible to measure.
Ideas box 2
What lies behind these quotes? How might they prevent you from finding out what
difference participation has made, if they are not confronted?
• Nobody ever asks the paid workers if their views are ‘representative’ ... can you imagine asking that in the middle of a meeting?
• If you’re taken seriously it makes you feel good.
• Things are slow to change, but it’s getting better.
• If you complain they say ‘don’t threaten me’.
• I’m the only voice.
• It took me a year to get the gist of the meetings – then I could contribute, but by then it was time to leave.
• Without good information we can’t make the choice which would suit us best.
• When there are too many pages I can’t be bothered, so I put it in the bin.
• People don’t go to McDonald’s to cook their own burgers.
• That’s how it is in all organisations.
Big question 3: What do we mean by making a difference?
Can all improvements be easily measured? If we just feel better or
understand more – is that a result in itself?
Summary
Evaluation is like opening up the ‘black box’ that aeroplanes use to track what
happens, though in social care the story of how the project or agency has been
working is more complex (T4; Baum et al, 1998). Finding out what differences
have resulted from service user and carer participation is also about finding out how
the participation has been making a difference. The participation might have been
about service users having a voice (being listened to), about having a choice (more
control over the services they receive) or about making changes to the services
as a whole. How people feel about the way they participated can be as important as
the results of their participation.
Findings box 3
• There is a difference between finding evidence about the process of participation
and reviewing the outcomes of participation (R06).
• Taking part can have its own benefits apart from any specific changes that come
about as a result of participation. More studies focus on process rather than
outcome (R02).
• There can be confusion over what is being evaluated (R05). It works better for
outcomes to be realistic, measurable and specific (R06, p44).
• What is measured must be meaningful. Number crunching approaches such as
admission and discharge rates or financial activity were not favoured (R05, p22).
• There is a close relationship between the method of participation, the degree of
satisfaction and the extent of change. Processes and outcomes are not divorced
from one another (R07).
Ideas box 3
This Ideas box can be used by all the people involved in evaluating participation.
It will help to show whether you are more likely to be interested in outcomes
or processes, intrinsic or extrinsic benefits, or a combination. Giving specific
examples of the kinds of changes that are wanted or expected is designed to
develop self-awareness, which is important in sharing expectations and avoiding
disappointment.
• I want to be listened to
• I want to see results soon
• I want to see changes
• I want to see lasting changes
• I want to have choice
• I want practices to change
• I want to feel involved
• I want policies to change

Complete these sentences as a way of starting a dialogue about how you and others
can begin to know whether participation is making a difference.
• An example of a change that is important to me would be ...
• An example of me being listened to would be ...
• I would know things were changing when I noticed that ...
• What I get most out of taking part is ...
Big question 4: When do we decide to find out whether a difference
is being made?
Is the timing right? Have things had time to develop?
Summary
If you do not know where you have started from you cannot know how far you
have travelled. Finding out where you are at the beginning of the process is called
a benchmark. A plan with time limits and deadlines also reminds you when to take
measurements. This will indicate how far you have travelled. In the house extension
example in Section 1 of this guide, we imagined that people might measure progress
in their own different ways and at different times. Some of these indicators can be
used quite soon, whilst others might have to wait a long time before we could use
them to measure success.
Findings box 4
• Many of the changes that we might expect to see as a result of service user and
carer participation take a long time (sometimes called ‘a long horizon’). This can
lead people to focus on the short term before the changes have accumulated
(R06).
• Although a long term view is sometimes needed, it may be difficult to keep
evaluation going over a long period (R06).
• Evaluation, like participation, can be occasional rather than continuous.
• A successful outcome might be less about change for the better and more about
keeping things from getting worse (P07).
• We need to find out more about how the methods used to evaluate participation
might need to be different depending on how far advanced the participation is
(R06).
• The ‘when’ question helps to recognise progress because you have to develop
indicators along the way (sometimes called ‘intermediate outcomes’) (R08).
• The Rickter evaluation model focuses on distance travelled rather than outcomes
(P01) (T8).
Ideas box 4
You have choices about when to evaluate. You can use the user-centred model
of participation below to consider at which stage it would be best to find out
whether participation is making a difference. Given that resources are limited, what
would your priorities be?
User-centred model
At the centre: EVALUATE changes resulting from participation as well as the experience of participation itself, surrounded by:
• Communication – trust and mutual understanding between users/staff/management to enable dialogue
• Professional development and management support for front-line workers
• Feedback – what happened? If nothing, why not?
• Commitment to participation by the organisation
• Financial and other supports (e.g. training) for participation
• Participation at appropriate levels using a range of methods and models
• Power – awareness of power differences and willingness to share power
• Participation as partners rather than one-off consultation
• Building different service user and carer groups
• Dialogue across different sections in the agency
(Adapted from T7, p51)
Although this is a user-centred model, it can of course be used by all those involved
in evaluating service user and carer participation – workers, managers, and policy
makers as well as service users and carers.
Big question 5: Who says?
Who does the evaluating? Will everyone get a say?
Summary
Who is involved in the evaluation and who will do the evaluating? How independent
will they be from the services they are evaluating? Who decides how to find out
whether participation has made a difference? Service users and carers should
participate in making decisions about who will be doing the evaluation and how it will
be conducted, but differences in power can affect this if they are not faced openly.
Care must be taken that evaluations are not felt to be destructive or hurtful and that
they are conducted in a safe way. Most of all, people who are ‘seldom heard’ must be
included.
Findings box 5
• Power is central to the question of who evaluates (R19) and power issues
underlie the majority of identified difficulties with effective user-led change
(R02).
• Whoever commissions the evaluation has a powerful voice both in what will be
evaluated and how, so participation by service users and carers in the decision to
evaluate is central to the participative process (R10) (R16) (R22) (P02).
• If service users and carers are involved in developing indicators to measure
progress, this could lead to definitions of quality that are more meaningful to
service users (R17).
• There is evidence that service users and carers are more likely to engage in
research that arises from their own questions and requests (R13) and response
rates may be improved.
• Who does the evaluating is important. It should be decided what competencies
are necessary and, therefore, what training in evaluation should be available
(R06).
• You need to think about whether evaluation should be independent and who it
should be independent from (P02) (P03). Is it appropriate for service providers
to evaluate their own services?
• It is important to consider who owns and who acts on the evaluation findings.
When written feedback was provided, it was invariably provided to parents
rather than to children (R04, p3).
• Consider who is responsible for the way the evaluation will be managed.
Coordinate the consultation with any others taking place at the same time or
covering similar topics or sections of the community (T3, p16).
• The participation initiative and the evaluation of it must have support from ‘the
top’ (T4).
Ideas box 5
Make it easier for people to join in and to find out what difference this is making by:
• holding events during the day, during the evening and/or at weekends
• providing a crèche
• providing transport
• making sure the location is accessible
• assisting people with hearing difficulties (induction loop system and/or signers)
• assisting people who do not speak English (such as using interpreters)
• writing information in different forms, such as large print, Braille, tape, Easyread and other languages
• finding out what food people can eat and want to eat
• making payments for people’s participation.
(Adapted from T1, p47)
Big question 6: How do we find out?
What methods might be used to find out whether participating has really
made a difference?
Summary
There are many different ways of finding out what the results of taking part have
been. The choice should be based on why the evaluation is taking place (see Big
question 1), who will be involved in the evaluation, what skills and resources are
available and the nature of the activity.
If several methods are used you can be more confident that the evaluation will be
accurate. Researchers put their findings to more than one test to try to make sure
that they are reliable. Using different methods might help different people to take
part; some people are more confident writing, others better at speaking. Children
might prefer drawing, for example, as a way of expressing themselves.
Findings box 6
• There is currently no best method of evaluating participation (T1).
• There is limited guidance, tools and knowledge about how services and
organisations can review the outcomes of participation (R06, p50).
• Where a toolkit is used it is important to know how it has been tried and tested
(Telford et al, 2004).
• The way the evaluation is done should reflect the values of participation (R06).
• Using a range of methods helps make sure that the evaluation paints a faithful
(valid) picture (R18). See Ideas box 6.
• Partnership approaches in which service users and carers join in the evaluation
(co-evaluators) (R10) or when they are the evaluators, can lead to more honest
feedback (R09).
• The way in which the evaluation is conducted, and how the information is
collected, should be relevant to all the people who have an interest in the
evaluation (P05).
• The creative arts can be used to involve people who are not very verbal or whose
English is limited (T5). These methods can be helpful where an evaluation
involves expression of feelings.
The costs of evaluation might mean that a sample is needed (that is, a smaller
number of people who are likely to represent the larger number). It may be
necessary to seek expert assistance to know how best to use sampling techniques.
Try the local university or research and development section of the statutory
services for assistance.
Ideas box 6
Different ways to find out whether participation makes a difference
• Group interviews and discussions
• Individual interviews and telephone conversations
• Story telling
• Committees and forums
• Observation (of participation)
• Questionnaires
• Drawings/cartoons
• Computer packages
• Photos
• The creative arts
• Complaints
• Suggestions box
The methods used to find out what difference joining in has made should
encourage people to join in even more.
Big question 7: What tools and resources do we need?
What is going to help us to find out whether participation has made a
difference? How do we make sure it is meaningful?
Summary
Think about the resources that will be needed to make a successful evaluation of
service user and carer participation. For example, what kind of support and training
might people need to take part in the evaluation or to conduct it themselves?
Support might come best from joining together in groups or networks to ‘flex your
collective muscle’ (Branfield et al, 2006) whether this is networks of service users,
carers, workers, managers or policy makers, or perhaps networks between all of these
groups.
The evaluation might take place at different levels. What is needed for a one-off
event will differ from on-going evaluation. Finding out about how service users
participate in their own care plans will need different resources to finding out about
their participation in strategic planning.
Findings box 7
• There is no evidence in the available literature of any systematic attempt to
make the link between methods of evaluation and models of participation, such
as Arnstein’s ladder (R16) and Treseder’s circular model (R06).
• Different stages of participation might lend themselves to different evaluative
approaches (T7).
• Differences in the size of the changes might suggest different kinds of evaluation.
Policy makers may be looking for the impact of participation at a strategic level
whilst service users may be more likely to focus on developing measures to
evaluate the impact on their day-to-day experience of services.
• Involvement at a higher strategic level is rare and so, presumably, is evaluation
of this level of participation (R04) (P01) (P04).
• How might people join in by joining together? Research suggests that service
user organisations and individual service users are often isolated and that the
development of service user networking is critical to making user involvement
work (Branfield et al, 2006).
• Some practice methods and planning approaches have evaluation built in to
their methodology, both short term results and longer term outcomes, though
it is rare to have a systematic analysis of the findings from these experiences
(Marsh and Doel, 2005; McLaughlin and Jordan, 1999).
Ideas box 7
What might be needed for a successful evaluation?
Skills: negotiating, interviewing, facilitation, communication, questionnaire design, groupwork, sampling, analysis, writing, presentation
Physical: equipment, rooms, transport, access
Personal: confidence, trust, commitment, persistence, strength, honesty
Training: for the above skills
Big question 8: What about differences?
How will differences be handled? What if there are conflicts?
Summary
Service users and carers are not one group. Although they have some things in
common, there are also many differences. When you are finding what difference
participating has made, how will you take account of these different groups and
the different kinds of evaluation that might be needed? And how might differences
within groups be evaluated? What conflicts might there be during the evaluation?
There are power differences within groups and between them (for example, between
service users and professionals). Has participating made a difference to the power
relationships and how will you know if that is the case?
Findings box 8
• In one study, self-advocates reported that they felt strongly that the process
of change needed to slow down; so evaluative approaches with self-advocates
might need to proceed at a different pace (R02).
• Finding out about children’s involvement needs particular skills and approaches,
both with the children and any workers who are supporting them (R03) (R21)
(P06) (P07).
• Evaluating young people’s participation requires particular consideration (T11).
• Approaches to evaluation should take different cultures into account and make
sure that black and minority ethnic groups are fully involved (R17).
• Differences between generations need to be considered (R03).
• Issues for lesbian, gay and transgender people need to be considered.
• Measures to find out what difference participation is making must be careful
to include people who are seldom heard (R11). How can information be made
accessible to different sections of the community?
• How might different views and outlooks be included? For example, one study
found that service users identify higher levels of unmet need than service
providers (R12).
• There are many different kinds of organisation in social care: those led by service
users and carers; substitute and complementary services; statutory, voluntary or
independent, etc. How will these differences affect the evaluation? And are there
differences between new and established services? We were reminded by our
advisory groups that it is the day-to-day experience of services that counts (R17,
pii).
• The outcome of participation should not be fixed ahead of time. It is
unpredictable, so that there will be some results that were not expected, but
which may be welcome even so. How will surprises be handled (T4)?
• Do the funders have the same idea about what should be evaluated as the
people who use and provide the service?
Ideas box 8
• Which groups of people or individuals might you have to make extra efforts to
reach (Shaping Our Lives, 2007)?
• Why might there be barriers to this group of service users (for example, the
barriers might be carers or people who believe that service users are incapable of
speaking for themselves)?
• What other barriers might there be?
• How might it be possible to work with these barriers in order to reach the service
user?
• How could you help to dismantle barriers to participation for people who have
not been involved before?
Big question 9: What happens next?
How is the information from the evaluation collected and made sense
of? Let’s have some ‘for instances’ of changes. How will we get feedback?
Who owns these findings and what will happen as a result of them?
Summary
It is important to consider who is responsible for the evaluation, who decides what
will happen to the findings and how people are going to get to know about them. It
should be clear who will make sure that recommendations are acted on, so that the
evaluation makes a difference.
The results of the evaluation should be shared with the people who have been
taking part. With their permission, the findings can be shared with other networks
of service users and carers so they can learn from the experience, too. However,
people’s privacy needs to be respected.
Participation should not come to a finish with the evaluation, but can carry on in the
way that the findings are shared and acted on.
Findings box 9
• Findings need to be participative (T7), be presented creatively and in ways that
are relevant, interesting and visible to the audiences for whom they are intended
(P05) (P06).
• Feedback to people who have taken part is crucial to prevent cynicism (R02) and
to maintain interest (T2). People should be involved in deciding how feedback
will be provided (R06).
• Who has the responsibility to make sure that the findings from the evaluation
will be implemented? Who will make sure that you continue to find out the
difference participation of service users and carers is making?
• How might wider publicity be given to the findings, for example through
websites (P04), so that others can use your experiences of evaluation (R09)?
• What are the implications of the evaluation? Are there broader, political
implications?
• Research suggests that a properly resourced national user-led network could
support the networking that is crucial for positive participation (Branfield et al,
2006).
• A third of the initiatives in one review (R04, p3) were not providing any feedback.
Ideas box 9
How might evaluation needs and methods differ depending on where you are in
the cycle?
• Where are we now? – assess the present situation
• Where do we want to go? – goals
• How will we get there? – methods and strategies
• How will we know when we’ve made it? – measures that indicate progress
• Go there! – action
• Did we make it? – monitor
(Adapted from T4)
Section 3
Checklist of pointers
A quick checklist to help your evaluation of service user and carer participation
1. PURPOSE: Are you clear about the purpose of the evaluation? Why is service user
and carer participation being evaluated?
2. CHANGE: What kinds of change might you expect service user and carer
participation to have made and at what levels is it expected to make a difference
– individual experiences, staff attitudes, agency policies, local or national
strategies?
3. TIMING: When will you measure these changes? Are you looking for short term
results, longer term outcomes or both? Do you have indicators of progress?
4. PROCESS OF PARTICIPATION: How might the experience of participation be
evaluated?
5. SUPPORT and SUPPORTERS: What kinds of support might be needed to make
the evaluation an effective and independent one? What part might supporters
and facilitators play in evaluating the results of participation?
6. SKILLS: What skills are needed to make an evaluation of participation?
7. TRAINING: What kinds of training are needed to help people to evaluate the
effects of participation? Is this training available?
8. RESOURCES: What resources are needed to evaluate participation? Are resources
such as budget available (e.g. for payments to service users and carers involved
in evaluations) and, if not, how might they be found or creatively substituted?
9. ORGANISATIONAL CULTURE: How open to participation is the organisation or
group? Does the climate or culture in the organisation support participation and
how do you find out about this?
10. PRACTICE: How participative is practice in the organisation or group? How do
you evaluate the way service users and carers are involved in practice?
11. STRUCTURE: Is evaluation of participation a regular feature of the organisation
or group? Is it part of the structure? How might evaluation help it become part of
the structure?
12. POWER: What differences in power are there between the people involved
(service users, carers, professionals, managers, etc.)? How might these affect the
evaluation? What can you do to change these differences in power? How will you
involve people who are seldom heard?
13. TOKENISM: How will you avoid tokenism? In other words, how will you evaluate
whether the participation has been real and meaningful?
14. THOROUGH AND FAIR: How will you make sure that your evaluation listens to
the negative messages as well as the positive ones, taking note of disadvantages
of participation as well as advantages?
15. LINKING PARTICIPATION TO CHANGES: How might you find out whether any
changes are indeed a result of participation and not something else?
16. OWNERSHIP: How will service users and carers participate in deciding what will
be evaluated and how? Who will undertake the evaluation and how independent
should they be from the process? Who will own the information gathered? Are
there any other ethical issues that you will need to consider (for example, about
confidentiality)?
17. FEEDBACK: How do people find out about the results of the evaluation of service
user and carer participation?
18. IMPLEMENTATION: How are the findings from the evaluation to be used? Who
will implement recommendations? What further changes should you expect as a
result of the evaluation?
19. CONTINUITY: Is evaluation a one-off event or an on-going process and part of
the way the organisation or group works all the time?
20. PUBLICITY: How do other organisations and groups learn from your experience of
evaluating the difference that participation has made?
Numbered references
R01 Crawford, M. (2003) User Involvement in Change Management: A Review of the
Literature, SDO
R02 Carr, S. (2004) Has service user participation made a difference to social care
services? London: SCIE
R03 Coad, J. and Lewis, A. (2004) Engaging children and young people in research,
Literature review for the National Evaluation of the Children’s Fund (NECF),
London
R04 Franklin, A. and Sloper, P. (2004) Participation of disabled children and young
people in decision-making within social services departments in England, Social
Policy Research Unit, York
R05 Young, A. F. and Chesson, R. A. (2006) ‘Stakeholders’ views on measuring
outcomes for people with learning disabilities’, Health and Social Care in the
Community 14(1): 17–25
R06 Wright, P., Turner, C., Clay, D. and Mills, H. (2006) Involving children and young
people in developing social care, London: SCIE Practice guide 6
R07 Cashmore, J. (2002) ‘Promoting the participation of children and young people
in care’, Child Abuse and Neglect 26(8)
R08 Crawford, M.J., Rutter. D., Manley, C., Weaver, T. and Bhui, K. (2002) ‘Systematic
review of involving patients in the planning and development of health care’,
British Medical Journal 325 (7375)
R09 Langton, H., Barnes, M., Haslehurst, S., Rimmer, J., Turton, P. and Langton, H.
(2003) ‘Collaboration, user involvement and education: a systematic review of the
literature and report of an educational initiative’, European Journal of Oncology
Nursing 7(4): 242–252
R10 Simpson, E. L., House, A. O. (2002) ‘Involving users in the delivery and
evaluation of mental health services: Systematic review’, British Medical Journal
325 (7375): 1265–1268
R11 Taylor, S. et al. (2005) Widening adult participation in learning: a systematic
review of strategies: research report, London: Learning and Skills Development
Agency
R12 Thornicroft, G. and Tansella, M. (2005) ‘Growing recognition of the importance
of service user involvement in mental health service planning and evaluation’,
Epidemiologia Psichiatria Sociale 14(1):1–3
R13 Trivedi, P. and Wykes, T. (2002) ‘From passive subjects to equal partners:
qualitative review of user involvement in research’, British Journal of Psychiatry
181:468–472
R14 Truman, C. (2005) ‘The autonomy of professionals and the involvement of
patients and families’, Current Opinion in Psychiatry 18/5:572–5
R15 Barnes, C. and Mercer, G. (2003), ‘Health/Research Review on User Involvement
in Promoting Change and Enhancing the Quality of Social ‘Care’ Services for
Disabled People’
R16 CCNAP (2006) Asking the Experts. A Guide to Involving People in Shaping Health
and Social Care Services, Community Care Needs Assessment Project (CCNAP)
R17 Janzon, K. and Law, S. (2003) Older People Influencing Social Care – Aspirations
And Realities. Research Review On User Involvement In Promoting Change And
Enhancing The Quality Of Social Care Services, London: SCIE
R18 Salmon, N. (2003) ‘Service Evaluation and the Service User: a Pluralistic
Solution’, The British Journal of Occupational Therapy 66(7)
R19 Williams, V. et al (2003), Has anything changed? Norah Fry Research Centre
R20 Bentley, J. (2003) ‘Older people as health service consumers: disempowered or
disinterested?’, British Journal of Community Nursing 8, 181–187
R21 Ackerman L., Feeny T., Hart J. and Newman J. (2003) Understanding and
evaluating children’s participation: A review of contemporary literature, Plan UK/
Plan International
R22 Kirby, P. and Bryson, S. (2002) Measuring the magic? Evaluating and researching
young people’s participation in public decision making, Carnegie Young People
Initiative (see T11)
Toolkits and further references
Most toolkits, guides and models focus on participation rather than specifically on
evaluation. However, the examples below all include sections on evaluation, and the
tools they describe to facilitate participation are also useful for evaluation. They are
intended to cover many different kinds of evaluation in different settings.
T1
The Darlington Toolkit – Y. A. Harrison (2004) Patient Carer and Public
Involvement Staff Toolkit, Darlington Primary Care Trust
This is a toolkit for staff and is based on the requirement of Section 11 of the Health
and Social Care Act (2001) for PCTs (Primary Care Trusts) to progress patient and
public involvement in a systematic and coherent way. A section on Recruiting
participants suggests ways of reaching groups who are seldom heard. The aspects
specifically relating to evaluation are on pp51–2. The toolkit considers the way in
which results are presented and who they are presented to. It suggests a series of
questions that staff should consider and includes an example procedure for securing
patient and public representation on committees, groups and panels. There are useful suggestions for further reading and websites. Ideas box 5 is adapted from this toolkit.
T2
Cathy Street and Barbara Herts (2005) Young Minds, Good Practice: Putting
Participation into Practice, www.youngminds.org.uk
A guide for practitioners working in services to promote the well-being of children
and young people. The central focus is on participation. Of most relevance to the
question of evaluation is a sub-section on involving young people in research and
evaluation. This usefully reminds us of the ethical issues involved in evaluating.
The authors note that ‘there has been a general move away from large scale
survey approaches towards methods that more actively involve service users in
research about mental health services’. The guide includes case studies and tools
for developing participation. Stage 3 of this process focuses on feedback, in which a
strong message is the need to ensure that participation and evaluation are part of
the planning cycle and not just an ‘add-on’. The guide looks at the advantages and
disadvantages of different methods of gathering information from young people,
families and carers. There is a comprehensive list of references.
T3
Waltham Forest Council (2004) Public Consultation Toolkit
This focuses on consultation as one aspect of participation. A checklist on
consultation (p16) includes ‘Make it clear who will manage the process and ensure
that contact details are available’ and ‘Coordinate the consultation with any
others taking place at the same time or covering similar topics or sections of the
community’. The evaluation of the consultation exercise is available at: www.lbwf.gov.uk/comp-pub-cons.pdf
T4
Department of Public Health (2000) Improving Health Services Through
Consumer Participation: a resource guide for organisations, Flinders University
and South Australian Community Health Research Unit
Looking outside the UK context, this provides a useful list of toolkits in the Australian
context. Again, it is primarily concerned with participation per se, but Section 5
(pp 111–4) concerns evaluation. The notion of a participation cycle has been
incorporated into Ideas box 9. The authors introduce the idea of evaluation as
dialogue, so that service users and carers can come to the table to talk about what
is of merit, value, worth or significance to them. This toolkit has a useful Evaluation
checklist (p113) and an alphabetical ‘Frequently asked questions’ section (pp115–127).
T5
Lewisham Primary Care Trust (2003) A guide to involving public, patients, users and carers in developing Lewisham Primary Care Trust, written by Marion Gibbon
Although this focuses more on getting people to participate, rather than evaluating
the effect of the participation, this is a useful guide to participation at a community
level. See: www.lewishampct.nhs.uk
T6
Appreciative Inquiry (AI) (http://www.aipractitioner.com)
Not so much a toolkit as a methodology for exploring the strengths in organisations.
It takes its evaluation from this strengths perspective.
As the term implies, Appreciative Inquiry focuses on appreciating and then giving
leverage to an organisation’s core strengths, rather than seeking to overcome or
minimise its weaknesses. It focuses on exploration and discovery of moments of
excellence in the organisation through deep inquiry, and an openness to seeing new
potentials and possibilities from that collective knowledge. ‘Organizations grow in
the direction of what they repeatedly ask questions about and focus their attention
on.’ Most of us have grown up in organisations that were comfortable with (some even addicted to!) identifying and analysing problems. AI suggests that there is another, more powerful model for organisational change, one that treats organisations as mysteries
to be embraced rather than problems to be solved. This alone is a powerful shift in
thinking.
For more information contact: Julie Barnes
[email protected] or lesley@mooreinsight.co.uk
T7
McGinn, P. (2006) Assessing the Impact of Service User Participation in the
Southern Area, Lurgan: Southern Health and Social Services Council, available
from: www.shsscouncil.net/pdfs/Assessing%20Service%20User%20Book.pdf
This is more of a review than a toolkit, but it is a useful example of an evaluation
of service user participation in terms of its impact. The review explains its terms of reference and methodology, and includes a literature review and an
overview of the policy context. The process and impact of service user participation
are illustrated with case examples. A user-centred model is presented to promote
participative practices (this has been adapted in Ideas box 4), with specific examples
of evidence from practice. The review concludes with some helpful guidelines.
T8
Hutchinson, R. and Stead, K. The Rickter Scale®
[email protected]
This tool is used by one of the example Practice sites (P01). It has been created
for practitioners by practitioners, with a minimum of recording documentation.
It measures ‘distance travelled’: the ‘soft’ outcomes that people achieve, such as
dealing with barriers to employment, training or education, by overcoming limiting
beliefs, and gaining confidence and self-esteem. The Rickter Scale® is essentially a
colourful plastic board with sliders on scales that read from 0 to 10. It is non-paper
based, with the specific intention of providing an experience that appeals to different
senses and learning styles.
It is a participative tool which can be used to evaluate participation. Ready-made
questions are available for a number of different situations (available on the website);
these could be adapted to measure the impact of participation.
The Rickter Scale® helps people to make informed choices and set goals which are
realistic and achievable, and to take responsibility for their own action plan and
determine the level of support they require. It is designed to enable people to take up
new perspectives which reflect their capabilities, beliefs, values and sense of identity.
The significance of people keeping their fingers in contact with the Rickter Scale®
during the questioning is related to what is known in neuro-linguistic programming
as ‘anchoring’. At the second or subsequent use of the Rickter Scale®, comparison is
made with this first ‘profile’ and thus ‘distance travelled’ is measured.
T9
Gwynneth Strong and Yvonne Hedges and Welsh translation by Emyr Huws
Jones (2000) Too Many Pages: SCOVO’s Guide to Involving Service Users to Make
Better Services
This accessible guide, written in English and Welsh, is dotted with cartoons and
illustrative quotations (‘if there’s too many pages, I can’t be bothered, so I put it in
the bin’). Its main audience is the learning disability sector, but it is relevant to other
sectors, too. Like the other guides and toolkits we have mentioned, its main focus is
on participation and involvement, but there are also useful ideas about feedback and
evaluation.
The guide squares up to issues such as representation, for example ‘nobody ever asks
the paid workers if their views are ‘representative’. Can you imagine asking in the
middle of a meeting “Are you sure Mr/Mrs Social Worker that the views you are giving
are representative of every employee in social services?”’ Ideas box 2 is adapted from
this toolkit.
T10 W. K. Kellogg Foundation (2004), Using Logic Models to Bring Together Planning,
Evaluation, & Action: Logic Model Development Guide (www.wkkf.org/Pubs/
Tools/Evaluation/Pub3669.pdf)
Like AI (T6), logic models are not so much a toolkit as an aid to planning and
evaluation. They set out the logical relationships between needs, goals, services and
outcomes. They provide a structure for understanding the process of change, whether
for projects or for individual work. Typically, a logic model will focus on the problem
and what is wanted as the end result, identifying that as the goal, but noting also
a series of mini-goals or milestones towards achieving the goal. Indicators that will
show whether each mini-goal has been achieved need to be made specific – in other
words ‘how will we know if change has occurred?’ Logic models measure results (the
progress at the end of the project or piece of work) and they also measure outcomes
(the progress at a later stage which shows whether the effects have been sustained).
There is increasing experience of using logic models at individual and service level.
A closely linked method is task-centred practice, which is a highly participative
method of practice in social work and social care (Marsh and Doel, 2005).
T11 Perpetua Kirby with Sara Bryson (2002) Measuring the Magic: Evaluating children’s and
young people’s participation in public decision making
London: Carnegie Young People Initiative, ISBN 0-900259-47-7 (www.carnegieuktrust.org.uk/files/main/Measure%20the%20Magic.pdf)
This publication reviews approaches to participation of young people in decision
making and provides guidance on the best practice to follow. The publication details
the problems and pitfalls in processes used in participation and provides a range of
useful sources of materials (see R22).
T12 The Evaluator’s Cookbook (2005) National Evaluation of the Children’s Fund
(www.ne-cf.org)
This resource contains 25 ‘recipes’ for participatory evaluation exercises for use with
children and young people. Interactive versions of a number of these exercises are
available on the NECF website. The Cookbook now includes ‘templates’ for a number
of the exercises.
The Cookbook is designed as a resource for anyone working with children to use
– either as part of an overall evaluation of children’s services/funds or as a tool
for evaluating particular activities (play schemes, single sessions, etc.). Although
primarily a resource for those working with 5–13-year-olds, all the exercises have also
been piloted for use with adults.
Arnstein’s Ladder of Participation
Although this ladder is about participation rather than evaluation, and it was first
developed in 1969, it is still frequently referred to and so we reproduce it here.
8 Citizen control – Degrees of Citizen Power
7 Delegated power – Degrees of Citizen Power
6 Partnership – Degrees of Citizen Power
5 Placation – Degrees of Tokenism
4 Consultation – Degrees of Tokenism
3 Informing – Degrees of Tokenism
2 Therapy – Non-participation
1 Manipulation – Non-participation
Other website-based toolkits include:
Worcestershire County Council:
http://worcestershire.whub.org.uk/home/wccindex/wcc-con/wcc-con-toolkit.htm
Bournemouth Teaching Primary Care Trust:
http://www.bournemouth-pct.nhs.uk/involvement/toolkit.htm
Other references
Baum, F., Cooke, R. and Murray, C. (1998) (see T4 references)
Branfield, F. and Beresford, P. with Eamon J. Andrews, Patricia Chambers, Patsy
Staddon, Grace Wise and Bob Williams-Findlay (2006), ‘Making user involvement
work: supporting service user networking and knowledge’, Joseph Rowntree
Foundation (download free from www.jrf.org.uk).
Marsh, P. and Doel, M. (2005) The Task-Centred Book, London: Routledge/Community
Care
McLaughlin, J. A. and Jordan, G.B. (1999) ‘Logic models: a tool for telling your
program’s performance story’ in Evaluation and Program Planning 22, pp65–72
Shaping Our Lives (2007) ‘Beyond the usual suspects: developing diversity in
involvement’ in Shaping Our Lives newsletter 10
Telford, R., Boote, J. D. and Cooper, C. L. (2004) ‘What does it mean to involve
consumers successfully in NHS research? A consensus study’, Health Expectations:
An International Journal of Public Participation in Health Care and Health Policy 7, 209–
220
Additional references: difference and diversity
Barnes, M., Davis, A., and Rogers, H. (2006) ‘Women’s voices, Women’s choices:
Experiences and creativity in consulting women users of mental health services’,
Journal of Mental Health 15, 329–341
Baxter, L., Thorne, L., and Mitchell, A. (2001) Small Voices, Big Noises. Lay involvement
in health research: lessons from other fields
Bennion, C. A. (2003) A User Involvement Strategy for Action for Blind People
Caron-Flinterman, J. F., Broerse, J. E., and Bunders, J. F. (2005) ‘The experiential
knowledge of patients: a new resource for biomedical research?’ Social Science & Medicine, 60,
2575–2584
Dewar, B. J. (2005) ‘Beyond tokenistic involvement of older people in research – a
framework for future development and understanding’, Journal of Clinical Nursing, 14
Suppl 1, 48–53
Evans, R. and Banton, M. (2001) Joseph Rowntree Foundation Findings: Involving black
disabled people in shaping services, York: Joseph Rowntree Foundation
Faulkner, A. and Morris, B. (2003) Expert paper: User Involvement in Forensic Mental
Health Research and Development, NHS National Programme on Forensic Mental
Health Research and Development
Griffiths, K. M., Christensen, H., Barney, L., Jenkins, A., Kelly, C., and Pullen, K. (2006)
Promoting consumer participation in mental health research: A National Workshop,
Depression & Anxiety Consumer Research Unit, Centre for Mental Health Research,
The Australian National University
Howe, A., MacDonald, H., Barrett, B., and Little, B. (2006) ‘Ensuring public and
patient participation in research: a case study in infrastructure development in
one UK Research and Development consortium’, Primary Health Care Research and
Development, 7, 60–67
Janzen, R., Nelson, G., Trainor, J., and Ochocka, J. (2006) ‘A Longitudinal study of
mental health consumer/survivor initiatives: Part 4 – Benefits beyond the self? A
quantitative and qualitative study of system-level activities and impacts’, Journal of
Community Psychology, 34, 285–303
Minogue, V., Boness, J., Brown, A., and Girdlestone, J. (2003) Making a Difference.
The Impact of Service User and Carer Involvement in Research, South West Yorkshire
Mental Health NHS Trust
Minogue, V., Boness, J., Brown, A., and Girdlestone, J. (2005) ‘The impact of service
user involvement in research’, International Journal of Health Care Quality Assurance,
18, 103–112
Mullender, A. and Hague, G. (2005) ‘Giving a Voice to Women Survivors of Domestic
Violence through Recognition as a Service User Group’, British Journal of Social Work,
35, 1321–1341
Nelson, G., Ochocka, J., Janzen, R., and Trainor, J. (2006) ‘A longitudinal study of
mental health consumer/survivor initiatives: Part 1 – Literature review and overview
of the study’, Journal of Community Psychology, 261–272
Nelson, G., Ochocka, J., Janzen, R., and Trainor, J. (2006) ‘A longitudinal study of
mental health consumer/survivor initiatives: Part 2 – A quantitative study of impacts
of participation on new members’, Journal of Community Psychology, 34, 261–272
Ochocka, J., Nelson, G., Janzen, R., and Trainor, J. (2006) ‘A longitudinal study of
mental health consumer/survivor initiatives: Part 3 – A qualitative study of impacts of
participation on new members’, Journal of Community Psychology, 34, 273–283
Oliver, S., Clarke-Jones, L., Rees, R., Milne, R., Buchanan, P., Gabbay, J. et al (2004)
‘Involving consumers in research and development agenda setting for the NHS:
developing an evidence-based approach’, Health Technology Assessment (Winchester,
England), 8, I–IV
Oliver, S., Milne, R., Bradburn, J. D., Buchanan, P., Kerridge, L., Walley, T. et al (2001)
‘Investigating Consumer Perspectives on Evaluating Health Technologies’, Evaluation,
7, 468–486
Robson, P., Begum, N., and Lock, M. (2003) Joseph Rowntree Foundation Findings:
Increasing user involvement in voluntary organizations, York: Joseph Rowntree Foundation
Roulstone, A., Hudson, V., Kearney, J., and Martin, A. (2006) Working together:
Carer participation in England, Wales and Northern Ireland (Rep. No. Stakeholder
Participation Position Paper 05), London: SCIE
Simpson, E. (2002) A guide to involving users, ex-users and carers in mental health
service planning, delivery or research: a health technology approach, University of
Leeds
Social Policy Research Unit, University of York (2004) Participation of disabled children and young
people in decision-making within social services departments in England (Rep. No.
2004–02), Research Findings, SPRU
Steel, R. (2005) ‘Actively involving marginalized and vulnerable people in research’ in
L. Lowes and I. Hulatt (Eds.) Involving Service Users in Health and Social Care Research
(pp18–29), London: Routledge
Truman, C. and Raine, P. (2002) ‘Experience and meaning of user involvement: some
explorations from a community mental health project’, Health & Social Care in the
Community, 10, 136–143
Practice sites
Ten ‘practice sites’ were included in the research as illustrations of evaluation of
the difference participation has made. All of these are included below, with detailed
accounts from nine of them.
P01 NIUSE SEA project (Northern Ireland Union of Supported
Employment, Supported Employment in Action project)
1 Characteristics of service users (‘beneficiaries’) involved
Service users are called beneficiaries in the project and they are disabled people with
physical, learning, sensory or hidden disabilities, or a combination of these.
2 How service user participation within the project is ensured
The SEA focus group consists of 13 service users who have accessed employment-focused services across Northern Ireland. The project aims to strategically review
services for disabled people – more specifically the gap in employment service
provision. Participation in the project gives disabled people access to policy review
through focus group and conference activities run by disabled individuals.
3 What policies on service user participation has the project formulated?
The beneficiary focus group has contributed to the review of employment-focused
services in Northern Ireland. They will have valid input into key policies affecting
disabled people accessing, maintaining and retaining paid employment. Through the
research methodology it is hoped the project will impact or influence future policies
related to service user participation.
4 How are service users (beneficiaries) supported?
Service users are supported with transport, accessibility, follow-up contact, member
organisations and consent issues.
5 How are the effects of participation monitored, audited, and evaluated? Who
carries out the evaluation?
The evaluation of participation is taking place from individual to policy levels using
focus groups, life histories, case studies and photography projects. The Rickter
motivational assessment tool has been deployed in a group format to further the
evaluation model used (this measures distance travelled) (T8).
An external evaluator is involved and the final evaluation is expected in summer
2007. Beneficiaries (service users) are involved as participants in the research.
6 A particular example of participation making a difference
Disabled participants now have the capacity to form opinions on their individual lived experience, identify their difficulties and relate these to structural barriers.
At a strategic level, the group, using the information it has produced on structural barriers, now engages with senior government personnel and other key stakeholders, making recommendations for change.
This project is still ongoing; the impact of service user participation is still to be determined, and evaluation will continue beyond the completion of the project in December 2007.
7 Contact details
Contact person(s): Edyth Dunlop, Regional Manager; Lorraine Boyd, Project
Facilitator
Address: Northern Ireland Union of Supported Employment,
SEA Project
NICVA Complex
61 Duncairn Gardens
Belfast
BT15 2GB
Telephone: +44 (0) 28 9087 5014
Textphone: +44 (0) 28 9087 5014
Fax: +44 (0) 28 9087 5008
Email:
[email protected]
Web address: www.niuse.org.uk/seaproject
P02 The Skillnet Group
1 Characteristics of people involved
The Skillnet Group is an organisation where learning-disabled people and non-disabled people work together equally.
2 How participation is ensured
Self-advocates have control in all aspects of the Skillnet Group. Participation is about
‘voice’, ‘choice’ and ‘change’. Self-advocates’ participation includes:
• team and training days (each project or service)
• planning meetings (new projects, services or ideas)
• communication days (all employees – includes learning-disabled people)
• internal reviews
• Speaking Up Network
• a board of trustees
• funding bids
• recruitment
• policies, procedures and risk assessments – everyone involved
• website – message board, suggestions/new ideas area, emails.
3 What policies on participation has the organisation formulated?
Participation began at the beginning, with the attempts to seek funding for the
project. The Skillnet Group has developed Steering groups and a ‘Decision team’
to embed participation throughout the organisation; these groups have members
who are learning disabled and members who are not. A website, as well as feedback forms, is used to feed back findings. Each project/service produces its own action plan every year – everyone works on these together. These feed into the overall three-year plan, which is monitored by the Decision team, a group of about 20 people, ten of whom are self-advocates.
4 How are people supported?
The Skillnet Group directly supports about 150 self-advocates and 300 others
indirectly to become more independent in all parts of life, including learning, work,
housing, health, money, transport, leisure and relationships. People are supported
to develop their own person-centred plans and to make sure their plans are put into
action and taken seriously.
5 How are the effects of participation monitored, audited, and evaluated?
In addition to the activities already mentioned, two evaluations have been conducted by self-advocates from another project; the most recent was completed in December 2006. Self-advocates were part of the evaluation team.
6 A particular example of making a difference
Self-advocates have always made the decisions about staff recruitment and this has
been a significant step.
7 Contact details
Contact person: Jo Kidd (Director)
Address: Second Floor, Maybrook House, Queen’s Gardens, Dover, CT17 9AH
Telephone: 07968 105862
Email:
[email protected]
Web address: www.skillnetgroup.co.uk
P03 People First Rhondda Cynon Taff (RCT)
1 Characteristics of service users involved
People First is a service-user led organisation for self-advocates (people with learning
disabilities). There are about 200 members.
2 How self-advocates’ participation within the organisation is ensured
There are meetings of self-advocates once a month in each of three counties. An
over-arching Executive Committee includes self-advocates. A participation guideline
has been developed to guide local statutory services.
3 What policies on self-advocates’ participation has the organisation
formulated?
There is participation at all levels from individuals to a three-county strategy. Self-advocates receive training to participate in interview panels. There has been a recent
bid, jointly with a local university, to undertake research, in which self-advocates will
participate as co-researchers.
4 How are self-advocates supported?
The self-advocates support each other. Staff offer support on a group basis.
5 How are the effects of participation monitored, audited, and evaluated?
Evaluations are undertaken as part of the terms of the funding for various projects
within People First. An independent evaluation was conducted in 2004 and another is
due, though it is not specifically evaluating participation. The AGM and annual report
are important for public audit of the project and the effects of participation. There are
occasional questionnaire surveys. However, self-advocates generally say they prefer
group discussions to writing things down.
There are tracking forms to see whether all parts of the learning disabled community
are participating – to understand whether some of the ‘seldom heard’ groups are
being included, too.
6 A particular example of participation making a difference
Training around participating in interviews has been important and has made self-advocates’ participation more meaningful. RCT train people with learning disabilities to become co-tutors, which enables them to deliver disability equality awareness
training.
7 Contact details
Contact person: Emma Alcock (Project Coordinator)
Address: Old Bank Buildings
The Square
Porth RCT
CF39 9NP
Wales
Telephone: 01443 683037
Email: [email protected]
Web address: rctpeoplefirst.ik.com
P04 Sandwell Visually Impaired Group (SVI)
1 Characteristics of service users involved
SVI serves adults with visual impairments in the Sandwell area of the West Midlands.
It is a service user-led partnership with the local authority. Currently (March 2007)
there are 200 members of SVI, which is 12% of all visually impaired people in the
borough.
2 How service user participation within the organisation is ensured
The purpose of participation is to enhance the quality of life of visually impaired people living or working in Sandwell and to help members to become self-advocates. SVI’s mission is the full implementation of the 16 RNIB standards laid out in ‘Progress In Sight’, which the council have also pledged themselves to implement.
There is participation on council committees and service provider teams (Sensory
Impairment) as well as job interviews for professionals.
3 What policies on service user participation has the organisation formulated?
Participation takes place at all levels except research. At the policy level, the council
have undertaken to develop a Sensory Impairment Booklet and SVI are engaged in
this process. SVI members have joined two council Scrutiny Panels (Equality and
Diversity; and Culture and Community) and sit on the Disabled Equality Project
Board.
4 How are service users supported?
Users of Sandwell’s Vision Services are supported by the endeavours of the SVI
Management team to secure the delivery of an improved service. To this end SVI
have established two strategic sub-groups. Both groups meet frequently and not less
than monthly. As community peers, with expertise, SVI are advocates for the less
confident.
Social support is provided through a quarterly newsletter distributed to all 1,700 people named on Sandwell’s register of visually impaired people, an interactive website, and entertaining quarterly ‘Open meetings’. The well-attended AGM (110 people in 2006) also provides a friendly forum for the community.
5 How are the effects of participation monitored, audited, and evaluated?
Internal evaluations are undertaken, but there is no independent evaluation.
Evaluations are not specifically about the effects of participation, but focus on the
RNIB’s ‘Progress In Sight’, which is regarded as the benchmark. Membership figures
are seen as a good indication of the success of participation (and these are increasing
dramatically). A regular newsletter and the SVI website and AGM are used to help
monitor participation.
6 A particular example of participation making a difference
Four of SVI’s committee members now have input into job applications. Initially this was confined to posts within Vision Services, but it is now broadening to embrace the wider council service.
7 Contact details
Contact person: Graham Price (Chair)
Address: PO Box 13235, West Bromwich, B70 1BE
Telephone: 0121 565 2875
Email:
[email protected] or
[email protected]
Web address: www.sandwellvisuallyimpaired.org.uk
P05 CHILYPEP (Sheffield Children and Young People’s Empowerment
Project)
1 Characteristics of service users involved
The Sheffield Children and Young People’s Empowerment Project (CHILYPEP) is a
registered charity that has worked with children and young people across Sheffield
for over five years. The purpose of the work is to support children and young people
in communities with high deprivation indicators to be actively involved in decision-making processes that affect their lives.
2 How service user participation within the organisation is ensured
The project is designed as a peer research project. Participants develop
questionnaires, which are analysed by staff but with reports written in consultation
with young people. Feedback is provided through area conferences to key public
services.
3 What policies on service user participation has the organisation formulated?
The organisation’s purpose is to develop participative consultation with children and
young people in key areas of Sheffield and to promote and facilitate the application
of findings of consultation exercises with public services.
4 How are service users supported?
A paid worker coordinates the peer research projects and supports the young people
to develop neighbourhood youth forums to take forward issues raised in the research
findings. A professional researcher ensures that the design and analysis of data
collection uses the themes and issues suggested by young people.
5 How are the effects of participation monitored, audited, and evaluated?
The process is self-auditing, with clearly defined outcomes for each episode of
participatory work.
6 A particular example of participation making a difference
A Peer Research Project was undertaken to find out the views and ideas of young
people. This piece of research was also designed to inform the Youth Strategy for this
area. The Community Forum will be responsible for the implementation and review
of the strategy, with support from partners within the Youth Strategy Group. The
Delivery Plan will cover a 12-month period, and will be reviewed and updated through
quarterly meetings. An evaluation event is planned (March 2007), which will work to
inform how this project is developed further. The project was also aimed at supporting
young people to develop new skills in research and consultation, and to gain a wider
understanding of how to influence decision making and have their voice heard.
7 Contact details
Contact person: Karen Hill (researcher)
Address: Children and Young People’s Empowerment Project
Remington Youth and Community Centre
200 Remington Road
Parson Cross, Sheffield S5 9AG
Telephone: 0114 2463897
Email:
[email protected]
Web address: http://www.chilypep.org.uk/ (website under construction)
P06 Cornwall Young Carers’ Project
1 Characteristics of service users involved
Service users are young people who have a caring role for a parent or grandparent who has a long-term illness or disability, or a dependency on alcohol or drugs.
2 How service user participation within the organisation is ensured
The project works to ensure that the young people who have a caring role have
quality time to enjoy leisure activities, that schools are sensitive to the issues that face them, and that all service providers are aware of problems that occur for these young
people on a daily basis. Young people are involved in decisions about individual and
group work within the project. They have also been closely involved in the design and
development of project evaluations.
3 What policies on service user participation has the organisation formulated?
The project operates from a participative framework in which young people are
encouraged and enabled to take part in shaping the direction of the project as a
whole and the direction of individual work they may be involved with.
4 How are service users supported?
Two project workers and an administrator are funded to work on the project. They
help with advice and information, practical support, liaison with professionals,
personal support, outings and leisure activities and arranging time off.
5 How are the effects of participation monitored, audited, and evaluated?
Evaluation has been a major focus of the development of the project. There have
been two major independent evaluations, the second of which built on the findings
of the first. Findings from the first evaluation pointed to the need for development of
administrative systems to support further evaluative work. Young people are closely
involved in the design, delivery and dissemination of the evaluation. Evaluation uses
an action research approach but recognises that this does not easily capture the
cumulative benefits of the project.
6 A particular example of participation making a difference
The emerging evaluation strategy has been able to demonstrate the need for an
on-going action research approach to the project while also recognising that impact
is cumulative for many young people. Many of the young people have presented at
conferences locally and nationally about the Project’s success.
7 Contact details
Contact person: Dawn Maddern
Address: 14 Chapel Street
Camborne
Cornwall
TR14 8ED
Telephone: 01209 614955
Email:
[email protected]
Web address: www.cornwallyoungcarers.co.uk/intro.htm
P07 The Visiting Advocacy Service in Secure Children’s Homes
(VOICE)
1 Characteristics of service users involved
Young people in secure children’s homes.
2 How service user participation within the organisation is ensured
VOICE (formerly Voice of the Child in Care) operate a visiting advocacy service
within a number of secure establishments to provide independent support to
young people who wish to raise concerns about their care or make representations.
Advocates visit on a regular basis either weekly or fortnightly and become familiar
and trusted persons, seen as independent of both the secure units and social services
departments. A key feature of their involvement with the young people is a high
threshold of confidentiality.
3 What policies on service user participation has the organisation formulated?
It is important to distinguish between the policies in the secure establishments and
those of VOICE itself. VOICE has an overall ‘blueprint for activity’ which aims to:
• focus on the child in everything we do
• promote good relationships with family, friends and professionals
• recognise that children and young people are competent and have the capability to
work in partnership with adults
• argue that the bureaucratic processes that have become associated with the care
system have to be minimised and adapted, if we are to serve children as individuals,
and promote their sense of identity.
4 How are service users supported?
The secure advocacy project provides advocacy visitors for young people who are in
secure accommodation, supporting interventions on their behalf.
5 How are the effects of participation monitored, audited, and evaluated?
The service has developed an exit interview as an approach to evaluating the
advocacy service, as well as reviewing the young person’s experience in the unit.
In theory all young people are able to contribute to the exit evaluation, although
unforeseen moves or releases from custody at court appearances mean that not all
young people participate. The exit interviews operate in three secure units at present
and developments in other locations are underway.
6 A particular example of participation making a difference
Representations to the visiting advocacy service resulted in a substantial change in
policies regarding telephone contact in one establishment. The managers saw the
process primarily as one part of the unit’s involvement of young people in the service
they receive.
7 Contact details
Contact person: Ian Storr
Address: Suite G15
Redlands Business Centre
3/5 Tapton House Road
Sheffield S10 5BY
Telephone: 0114 266719
Email:
[email protected]
Web address: www.voiceuk.org.uk
P08 Mersey Care Trust
1 Characteristics of service users involved
All users were mental health service users and carers. The evaluation asked why
service users wanted to become involved in services. The most common reasons were
to ‘change and improve services’, ‘to give something back’ and ‘to have something
meaningful to do’.
2 How service user participation is ensured
Activities and methods that ensured participation described by service users included:
• recruitment and selection of staff
• open space events
• provision of information
• involvement in a user forum
• positive achievement awards.
3 What policies on service user participation have been formulated?
The Trust has a strategy which states the importance of user involvement/participation. In the survey of staff members, 58 out of 73 said they were personally responsible for involving service users.
4 How are the effects of participation monitored, audited, and evaluated? Who
carries out the evaluation?
The evaluation of participation was undertaken in the form of a survey of 201 people
who had participated in services and who were held on a user participation database.
5 A particular example of participation making a difference
Both service users and staff were asked in the survey how participation made a
difference.
Service users: 44 out of 93 said it made a lot of difference to their lives, 34 out of 93 said it made some difference, and 8 out of 93 said it made no difference.
Practitioners: 32 out of 73 said it made a lot of difference, 34 out of 73 some difference, and 6 out of 73 no difference.
6 Contact details
The evaluation report for Mersey Care Trust can be found at: www.merseycare.nhs.uk/Library/service_user_and_carer_involve/Involving_S_&_C/SUCReportColourUpdate2.pdf
P09 The TRUE Project
1 Characteristics of service users involved
This case study is the evaluation of the experience of user participation as co-researchers on a research project. The evaluation report is called ‘A Story of Colliding Worlds’.
The TRUE (Training in Research for Service Users Evaluation) Project was undertaken
to scope training provision in the UK relevant to consumer involvement in research:
to identify what elements are effective, and to develop a good practice guide.
The co-researchers in the TRUE Project were adults who were mental health service
users. All user-researchers were members of CAPITAL (Clients and Professionals in
Training and Learning), a mental health service user organisation.
2 How service user participation within the organisation is ensured
The project was a three-way collaboration between Worthing & Southlands Hospitals
NHS Trust, CAPITAL and the University of Brighton.
3 What policies on service user participation has the organisation formulated?
The evaluation focussed on user involvement in research.
4 How are service users supported?
The TRUE team comprised seven service users (members of CAPITAL, a mental health
service user organisation), three project supervisors and a project coordinator. The
three supervisors represented the three organisations involved in the collaboration:
one researcher from Worthing & Southlands Hospitals NHS Trust, one researcher
from the University of Brighton, and the third being the Director of CAPITAL.
5 How are the effects of participation monitored, audited, and evaluated?
The ‘Colliding worlds’ evaluation took place at the end of the TRUE Project. The
TRUE research team requested the opportunity to reflect on their experiences
of being involved in the project. This evaluation explored the experience of user participation as co-researchers; its purpose was to improve the participation of service users in research.
A one-day event was arranged at the end of the project to bring everyone together in
order to reflect upon the project. Five service users and three researchers attended.
This was facilitated by an independent researcher.
A second half-day was organised for those people who wished to take part in a
further opportunity to share their views in more depth. Four people (one researcher
and three service users) attended and one further service user submitted a response
in writing.
6 A particular example of participation making a difference
The TRUE team produced an action pack that has been tried and tested. This pack was
downloaded 9342 times in the first year.
7 Contact details
The TRUE project can be found at: invo.org.uk/pdfs/TRUE%20final%20report130404.pdf
Capturing the experiences of those involved in the TRUE Project: A Story of Colliding
Worlds can be found at: www.invo.org.uk/pdfs/colliding%20worlds.pdf
P10 London Primary Care Studies programme
This was a commissioned evaluation of the impact of service user involvement in
research across a wide range of settings within primary care. Service users were
involved in interview design, revising questionnaires, finding new ways to collect
data and increasing the number of participants. They contributed to the interpretation of
data and to dissemination of findings through their own networks. Service users and
carers changed services based on the research findings, and measured the impact
of those changes. There was a direct relationship between the level of engagement
and positive feelings about it. Service users and carers who felt more remote from
the senior researchers were more likely to report a mixture of positive and negative
experiences of their participation. Methods used included regular telephone contact and easy-to-understand language. There was a need for respect for service user
knowledge and insights and a strong commitment from everyone to use involvement
to improve research and service delivery. In projects not achieving this level of
partnership, participants reported the use of jargon by researchers and clinicians. The
report provides guidance on best practice in service user involvement in evaluation.
Contact details
Contact person: Nicky Britten
Address: Peninsula Medical School, St Luke’s Campus, Exeter EX1 2LU
Telephone: 01392 264859
Email:
[email protected] or
[email protected]
Website: http://www.invo.org.uk/pdfs/Summary_of_PC11Report1.pdf
RG07
SEPTEMBER 2007
Participation: finding out what
difference it makes
The Social Care Institute for Excellence has
commissioned this guide to help users, carers
and professionals design evaluations of service
user and carer participation. The principles of service user and carer participation are well established; by evaluating participation we will be able to see what impact this involvement is having on social care services.
This publication is available in an
alternative format upon request.
Social Care Institute for Excellence
Goldings House
2 Hay’s Lane
London SE1 2HB
tel 020 7089 6840
fax 020 7089 6841
textphone 020 7089 6893
www.scie.org.uk
Registered charity no. 1092778
Company registration no. 4289790