      Going DEEP: an evaluation of a social pedagogy-informed approach to evidence-enriched practice in social care


            Abstract

            Social care workers benefit from multiple types of evidence to enhance citizen well-being, support their own well-being and improve social care services. Building capacity within social care to find, collect and use different forms of evidence is an international concern. The Developing Evidence Enriched Practice (DEEP) programme in Wales is informed by the values and aims of social pedagogy. It aspires to enhance both the generation and use of evidence in social care. To learn about what works in the programme, we conducted an evaluation based on contribution analysis that explored programme impacts between 2020 and 2023. Based on a co-produced theory of change the evaluation drew on exemplar cases, questionnaire responses, documentary evidence, process data and unsolicited feedback. There was evidence that the DEEP programme contributed to people better valuing and gaining a better understanding of different forms of evidence. Citizen voice could become more central in decision-making, and there were examples of practice, policy and research being informed by diverse evidence. Many people who attended the DEEP learning course enhanced their confidence and skills by using the DEEP approach and said that they would put their learning into practice. It was harder to evidence longer-term impacts and the sustainability of the approach. These findings suggest that there can be merit in developing capacity-building programmes informed by social pedagogy. Such programmes can be characterised as relational, holistic, practice-focused, multifaceted, contextualised and co-produced with intended beneficiaries.

            Main article text

            Background

            Social pedagogy is values-led. It seeks to create environments and practices that support well-being and inclusion as well as learning. Social pedagogy is a growing approach to service and staff development in the UK, particularly in social care (Kirkwood et al., 2019; Petrie, 2013; ThemPra, 2015a). This makes it important to evaluate and reflect on the contribution that programmes informed by social pedagogy make.

            The need for cultures to be supportive of holistic learning within social care is clear. We recognise that there is no universally agreed definition of social care (Cameron et al., 2021) and we use the term to refer to organisations that provide non-medical care and support to people to live a ‘life on equal terms’ (Beresford, 2020, p. 2). This can include a diverse range of practical and emotional support to engage in activities, support with personal care, support for families and young people, and community development. In the UK, local authority social care services are required to demonstrate how evidence has informed their policy and practice (Lamont et al., 2020) and there are similar expectations in other economically developed countries (Head, 2010; Rycroft-Malone, 2008). These requirements assume that multiple forms of evidence can enhance practice, leading to better care and well-being. It is also hoped that evidence will elucidate the questions to be addressed in further research and practice development.

            Furthermore, there is a clear need for new initiatives to enhance learning and development cultures within social care. The challenges of engaging social care staff in working with evidence are well documented (Boote et al., 2010; Choi et al., 2005; Ghate and Hood, 2019). Resource constraints are often cited (Lamont et al., 2020; Orme and Powell, 2007), but challenges exist in several domains. There is limited infrastructure both within (Seddon et al., 2021) and outside social care organisations to support evidence work (Arnold, 2006; Inoue et al., 2017) and inconsistent approaches to evidence governance (Boody and Oliver, 2010). There can be a gap between the foci of academic research and the evidence that practitioners want (Huxley, 2009). In addition, there are workforce shortages (Lamont et al., 2020) and a dearth of evidence skills in the workforce (Orme and Powell, 2007), issues that are compounded by the complexity of undertaking research in social care (Ghate and Hood, 2019; McLaughlin, 2012). Moreover, there are few career pathways for practitioners interested in evidence work (Gazeley et al., 2019).

            Historically, initiatives to foster evidence work have focused predominantly on methods rather than processes and evidence has often been conceptualised as meaning only research, without consideration of other forms of knowledge. In contrast, the programme Developing Evidence Enriched Practice (DEEP – www.deepcymru.org) emphasises process rather than discrete methods. It also emphasises social justice values and the social construction of knowledge. Informed by social pedagogy principles and co-production, it promotes the use, synthesis and contextualisation of diverse forms of evidence to achieve a better life for people and communities and those working in social care. Evaluation (observing and reflecting for ongoing improvement) is a core component of social pedagogy practice and given the distinctiveness of DEEP in social care, there was an impetus to learn from what DEEP had contributed. Before presenting the evaluation, we will share more about the programme and the context in which it operated.

            Consistent with the values and practices in social pedagogy as referenced in Cameron et al. (2021), DEEP can be characterised as a holistic and values-based approach to learning and development based on story and dialogue. DEEP emerged from a participatory action research project (Andrews et al., 2015), which showed the value of thinking about evidence in more complex ways, that is, evidence as inclusive of lived experience, practitioner wisdom and knowledge generated through research. The project also highlighted the practical dilemmas of creating the learning environments where diverse forms of knowledge can be drawn together, especially in organisations imbued with managerial values and practices. Creating these learning spaces requires respecting how people learn and an openness to thinking critically, talking and sharing across epistemic divides. DEEP draws on the principles of social pedagogy to support the use of evidence in an ongoing and dynamic educational process.

            The initial Joseph Rowntree Foundation-funded DEEP project (Andrews et al., 2015) across six sites in Wales and Scotland started with a commitment to the collective exploration and application of evidence involving older people, unpaid carers, practitioners, researchers and managers. From the outset, it was clear that relationships and dialogue-learning would play a key part in the project’s success and connections were made with oracy experts in the University of Cambridge.

            Building on the success of the initial DEEP project, connections were later made with social pedagogy through a serendipitous meeting between the principal investigator and Gabriel Eichsteller of ThemPra in 2016. At that time, the Welsh government had invested further funding to support the national roll-out of the DEEP approach – the challenge being to articulate what this ‘approach’ was. This was where the synergy between DEEP and social pedagogy was recognised. Both are approaches rather than methods and, as such, need support with articulation. The diamond model provided a helpful framework for articulating the DEEP approach. This model was incorporated into the eight DEEP principles and the DEEP training curriculum, which is framed around a gardening metaphor:

            • prepare the ground (creating good learning environments through applying social pedagogy principles)

            • gather and sow the seeds (of diverse types of evidence)

            • grow the garden (using caring dialogic-learning methods).

            The context in which DEEP operates

            The DEEP programme operates at a national level within complex systems in Wales. Wales is a devolved UK nation where social care is delivered by a range of public, private and not-for-profit providers. Social pedagogy is not widely known or applied in Wales and yet there are strong Welsh government policy imperatives around relational mutuality, and collective learning and development as outlined in the Social Services and Well-being (Wales) Act 2014 (Welsh Government, 2014) and the Well-being of Future Generations (Wales) Act 2015 (Welsh Government, 2015). The Welsh government’s commitment to this can be seen in the extension of funding for DEEP from 2023 to 2025 to work in partnership with Social Care Wales (the national social care workforce regulation and service improvement agency in Wales) to develop a strategic approach to DEEP across Wales.

            As DEEP embarked on its learning/evaluation journey, members of its steering group, which comprised practitioners, academics and interested citizens, identified enablers and challenges for the programme (see Table 1). They not only highlighted examples of positive social care practice in Wales, but also noted organisational practices that pull in a different direction to the values of social pedagogy. The evaluation period (2020–3) was also a time of significant strain on social care services in the UK due to the Covid-19 pandemic and the cost-of-living crisis contributing to workforce turnover, shortages and burnout (BBC, 2022; Senedd Research, 2021). As Cameron et al. (2021) have highlighted, although the policy context in Wales can be seen as supportive of social pedagogy approaches, there are systemic and institutional challenges in social care.

            Table 1

            Contextual analysis for the DEEP programme.

            DEEP operating environment

            Enablers | Challenges
            A clear DEEP programme vision | Short-term funding and decision-making
            Legislative support for innovation | Innovative legislation not fully implemented
            Examples of positive social care practice | The change process can be difficult

            Between 2020 and 2023 there were other UK initiatives in Wales supporting the generation and use of evidence in social care. These included the British Association of Social Work Practitioner Research Network and IMPACT – the UK centre for evidence implementation in adult social care, funded by the Economic and Social Research Council and the Health Foundation. DEEP was distinct from other initiatives in its combination of:

            • a foundation in the principles and practices of social pedagogy

            • a focus on dialogical tools (such as Exploratory Talk and Community of Enquiry)

            • an emphasis on relational and organisational learning and development processes.

            DEEP was arguably also more inclusive of diverse social care stakeholders. A literature search undertaken by the authors found papers detailing capacity-building interventions in social care emanating from the US, Australia, Canada, New Zealand, Sweden and the UK (for examples, see Harper and Dickson, 2019; Lachini et al., 2016; Lunt et al., 2012), but none of these reached out to collectively and simultaneously include citizens, social care staff, providers, policymakers and researchers.

            The DEEP programme
            Programme overview

            From 2015 DEEP was a work package in the Wales School for Social Care Research. Between 2020 and 2023 it was funded as a standalone programme by the Welsh government through Health and Care Research Wales (2022). It is currently funded by Social Care Wales. The 2020–3 programme had a full-time equivalent staff of 3.8.

            The DEEP approach

            Informed by social pedagogy, DEEP also draws on theoretical insights from community development, implementation science and research. DEEP aims to inform social care research and practice through multiple types of evidence. Consistent with social pedagogy, rather than constituting a precise method, the DEEP approach comprises five principle-based elements (see Table 2 and Andrews et al., 2020).

            Table 2

            The five elements of the DEEP approach.

            Element: Create enriched environments of care and learning
            Explanation: Safe, inclusive learning spaces where everyone feels welcomed, valued and able to contribute
            Principles: Support well-being; start with what people know and find interesting; help people make sense of what they learn
            Theoretical basis: The SENSES Framework (Nolan et al., 2006); Human-Centred Learning (Lowe and Plimmer, 2019); Social Pedagogy (Hatton, 2013)

            Element: Recognise and address structural obstacles
            Explanation: Address systemic issues and consider context when using evidence
            Principles: Share leadership and decision-making; be mindful of context when using evidence
            Theoretical basis: Applied phronesis (Flyvbjerg, 2001); Radical social work theory (Ferguson and Woodward, 2009); Complexity theory (Auspos and Cabaj, 2014; McMillan, 2008); Community development (Ife, 2018)

            Element: Value and use a range of evidence
            Explanation: Evidence from lived experience, practice, research and organisational knowledge
            Principles: Share all sorts of knowledge
            Theoretical basis: Knowledge democracy (Beresford, 2018; Hall and Tandon, 2017); Participatory democracy (Escobar, 2011)

            Element: Gather and present evidence in meaningful formats
            Explanation: Collect and share evidence in ways that engage the head and the heart
            Principles: Use stories in learning
            Theoretical basis: Storytelling and narrative-based learning (Bruner, 1991); Experience-based co-design (Bate and Robert, 2007)

            Element: Effectively talk and think together about evidence
            Explanation: Engage in dialogue to enhance learning and promote implementation
            Principles: Be brave and challenge each other kindly
            Theoretical basis: Social constructionism (Berger and Luckmann, 1966); Dialogue-learning and inter-think (Littleton and Mercer, 2013)

            A focus on strengths, assets and empowerment is integral in these elements. They seek to support well-being, foster meaningful relationships, create positive experiences and generate opportunities for holistic learning.

            Programme aims and objectives

            The 2020–3 programme aims were that the DEEP approach:

            • was clearly articulated and widely understood

            • was embedded in strategic initiatives in Wales

            • enhanced social care policy and practice

            • increased capacity in social care research

            • shared its evidence base.

            Objectives were detailed in five work packages, as seen in Table 3.

            Table 3

            Work packages and objectives.

            Work package: Evaluation
            • Conduct a contribution analysis guided by the Matter of Focus approach

            Work package: Learning courses and DEEP catalyst role*
            • Develop a learning curriculum with stakeholders
            • Create a DEEP catalyst role

            Work package: Resources
            • Produce materials to help people understand and use the DEEP approach

            Work package: Social care priority areas
            • Co-facilitate events on priority topics

            Work package: Bridging research and practice
            • Contribute to the social care research infrastructure in Wales
            • Co-develop research ideas and priorities with stakeholders
            • Support research and practice development groups

            * DEEP catalysts are people trained to use and promote the DEEP approach.

            Between 2020 and 2023 the programme team devised and delivered learning opportunities, held events on priority social care topics using methods such as Community of Enquiry, and worked with individuals and organisations to help them use the DEEP approach in development projects. The programme team were active in supporting research and sat on several project steering groups, as well as supporting research students to use aspects of the DEEP approach in their work. The team also engaged with the social care research infrastructure in Wales, supporting capacity-building initiatives. For instance, the team contributed to the development of the Social Care Wales evidence offer. How DEEP operated was informed by recommendations in the research capacity-building and learning and development literature. The programme:

            • worked at different layers of the social care and research infrastructure (Huxley, 2009), working with colleagues who were policy leads, infrastructure staff, providers and practitioners

            • provided support in multiple ways (Donley and Moon, 2021; Harper and Dickson, 2019), including through experiential opportunities on learning courses, mentoring and facilitating peer support (Withington et al., 2020)

            • connected people (Rubin et al., 2016), focusing on establishing relationships (Fox and Hopkins, 2021), which included supporting research and practice development groups (RPDGs)

            • offered information and support that responded to ‘what mattered’ (Mignone et al., 2018) through learning courses and through working with individuals and organisations.

            Evaluation method

            The evaluation was designed with and supported by Matter of Focus, whose evaluation approach is based on contribution analysis (Matter of Focus, 2017). Contribution analysis is theory-based and seeks to explore whether an intervention has contributed to observed outcomes and, if so, how (Mayne, 2012). It is a useful approach when interventions are multifaceted and work within complex systems (Matter of Focus, 2017), and it is consistent with the values of social pedagogy, which emphasises learning and reflection rather than summative analyses that imply ‘endpoints’. The DEEP evaluation:

            1. co-developed a theory of change with the programme steering group and Matter of Focus (see Table 4)

            2. plotted a pathway illustrating how activities and outputs led to outcomes

            3. evidenced the pathway

            4. reviewed the evidence

            5. synthesised the learning as the basis for further reflection.

            Table 4

            DEEP theory of change.

            What we do:
            • Catalyse dialogue
            • Support people to share their evidence, amplifying marginalised voices
            • Pilot and adapt a range of story and dialogue-based methods
            • Help people to use the DEEP approach

            Who with:
            • People who the DEEP programme can learn from
            • People who want support to gather and/or use evidence
            • Policymakers and national organisations in Wales

            How they need to feel:
            • Safe, supported, valued and listened to
            • Open to reflection and new ideas
            • That engaging with the DEEP approach is a good use of time

            What they learn and gain:
            • A deeper valuing of different forms of evidence
            • Skills and confidence in using the DEEP approach
            • A greater understanding of their own context for using evidence

            What they do differently:
            • Use the DEEP approach in their role
            • Champion the DEEP approach
            • People and their experiences are put at the centre of decision-making

            What difference does this make?
            • Practice, policy and research are informed by the DEEP approach
            • The DEEP approach becomes widespread
            • The DEEP approach has an accessible evidence base
            • People in Wales experience better social care and improved well-being

            Swansea University College of Human and Health Sciences Research Ethics Committee (reference: 21072021) approved the data collection and analysis plan. All participants provided informed consent and were aged 18 or older. Evidence was collected from a range of stakeholders including citizens, practitioners, people working in regional roles, policy advisers and academics. Multiple forms of evidence were collected.

            Exemplar cases

            We captured the impacts achieved in exemplar engagements and identified facilitators and barriers to using the DEEP approach in practice. Impacts were captured through semi-structured online interviews. Each interview lasted approximately an hour and was recorded using the Microsoft Teams platform. Three exemplars were selected through team discussion:

            1. a project conducted by a health board and a third-sector (charitable) organisation to develop prevention services (two interviews);

            2. work in a local authority to develop policies to support unpaid carers (three interviews); and

            3. a project in a charity to capture evidence of change (three interviews).

            The key respondent in each exemplar was also invited to contribute evidence that illustrated the work being done and its impacts. In exemplars one and three, key respondents asked other staff members to provide short accounts of the positive effects that they had experienced, which were then shared anonymously with DEEP. In exemplar two, a poem that had been written as part of the work was shared.

            The collated evidence was analysed using framework analysis (Ritchie and Spencer, 1994), which can accommodate different types of data and is appropriate when there is a pre-specified exploration lens, such as the DEEP theory of change (Srivastava and Thomson, 2009). The analysis involved five stages: familiarisation; applying the DEEP theory of change as the thematic framework; indexing; charting; and mapping and interpretation.
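
            To make the indexing and charting stages concrete, a minimal sketch is given below in which excerpts are tagged with the theory-of-change section they illustrate and then charted into a case-by-section matrix. The case names and excerpts are invented for the example, and this is an illustration of the general technique rather than the evaluation's own analysis software.

```python
# Illustrative sketch only: the indexing and charting stages of framework
# analysis, using the DEEP theory of change as the thematic framework.
# Case names and excerpts are hypothetical.
from collections import defaultdict

THEORY_OF_CHANGE = [
    "What we do",
    "Who with",
    "How they need to feel",
    "What they learn and gain",
    "What they do differently",
    "What difference does this make?",
]

# Indexing: each excerpt is tagged with its source case and the
# theory-of-change section it speaks to.
indexed_excerpts = [
    {"case": "Exemplar 1", "section": "What they learn and gain",
     "excerpt": "Stories felt like a more meaningful way of capturing outcomes."},
    {"case": "Exemplar 2", "section": "What they do differently",
     "excerpt": "Carers' views now shape the commissioning specification."},
]

def chart(excerpts):
    """Charting: build a case-by-section matrix of the indexed excerpts."""
    matrix = defaultdict(lambda: defaultdict(list))
    for item in excerpts:
        matrix[item["case"]][item["section"]].append(item["excerpt"])
    return matrix

# Mapping and interpretation then works across the completed matrix.
if __name__ == "__main__":
    for case, sections in chart(indexed_excerpts).items():
        for section in THEORY_OF_CHANGE:
            for text in sections.get(section, []):
                print(f"{case} | {section} | {text}")
```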

            Questionnaire data

            A range of questionnaires were employed. These were completed anonymously online or in hard copy and comprised Likert scale items and open-ended questions. The questionnaires asked about the respondent’s experience of engaging with DEEP and how they intended to use their learning. Following the DEEP learning course, a three-month follow-up questionnaire explored whether practice intentions had been implemented. This questionnaire also gathered feedback on the mentoring and peer support provided. We used descriptive statistics to investigate Likert-scale responses and content analysis to explore free text questions. Content analysis considers manifest meaning units and common content categories, and our analysis was informed by the approach detailed by Graneheim and Lundman (2004).
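
            The descriptive statistics for the Likert items amount to simple summaries of each rating scale. A minimal sketch is given below, assuming a CSV export with one column per Likert item scored 1 to 5; the file name, column names and layout are assumptions for the example, not the evaluation's actual data or code.

```python
# Illustrative sketch only: descriptive statistics for Likert-scale items from
# a questionnaire export. The CSV layout, file name and column names are
# assumptions made for the example.
import csv
from statistics import mean

def summarise_likert(path, items):
    """Return the mean rating for each named Likert item (scored 1-5) in a CSV file."""
    ratings = {item: [] for item in items}
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            for item in items:
                value = row.get(item, "").strip()
                if value:  # skip blank (unanswered) cells
                    ratings[item].append(int(value))
    return {item: round(mean(values), 2)
            for item, values in ratings.items() if values}

# Hypothetical usage:
# summarise_likert("event_feedback.csv", ["networking", "security", "belonging"])
```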

            Documentary evidence

            Documents included policies, guidance documents and standards, as well as programme outputs. Documents were often provided by the people we worked with. If a document was not in the public domain, we sought permission to cite it.

            Process data

            Process information included registration data for events and learning courses. This was analysed using descriptive statistics.

            Unsolicited feedback

            Unsolicited feedback helped capture unanticipated impacts and permission was obtained to quote this feedback.

            Synthesis

            OutNav (Matter of Focus, 2017) provided a platform for collating the data. Some sections of the DEEP theory of change were evidenced by a single data source (for example, questionnaire data) but most sections drew on multiple types of data. The synthesis considered data consistency and the extent of triangulation when determining the strength of evidence for each section of the theory of change.
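
            OutNav is Matter of Focus's own platform, so the sketch below is not a description of how it works. It is only an illustration, with arbitrary assumed thresholds, of the general idea of grading the strength of evidence for a theory-of-change section by how many distinct types of source support it.

```python
# Illustrative sketch only: collating evidence against one section of a theory
# of change and grading triangulation by counting distinct source types.
# The section, evidence items and thresholds are assumptions for the example.
from dataclasses import dataclass, field

@dataclass
class Section:
    name: str
    evidence: list = field(default_factory=list)  # (source_type, summary) pairs

    def strength(self) -> str:
        """More distinct source types suggests stronger triangulation."""
        source_types = {source for source, _ in self.evidence}
        if len(source_types) >= 3:
            return "stronger"
        if len(source_types) == 2:
            return "moderate"
        return "weaker" if source_types else "no evidence"

learning = Section("What they learn and gain")
learning.evidence.append(("questionnaire", "Confidence ratings after the learning course"))
learning.evidence.append(("exemplar interview", "Service manager on the value of stories"))
learning.evidence.append(("unsolicited feedback", "Email about using exploratory talk"))
print(learning.name, "->", learning.strength())  # stronger
```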

            Findings

            The DEEP theory of change provides a structure for reporting the evaluation findings. Key learning and reflections are summarised in the discussion.

            What DEEP did and with whom

            Between April 2020 and December 2022, DEEP delivered 113 events about the DEEP approach and other topics that had arisen from the interests and needs of practitioners. Example events included a session on using the DEEP approach in parent advocacy work, a bitesize session on the DEEP principles and a workshop on participatory evaluation. Registrants included 1,131 practitioners, 670 managers, 29 policymakers, 71 researchers and 118 citizens.

            During 2020–3, DEEP connected with 175 organisations. Geographically, these connections covered all regions of Wales and reached into England, Scotland, Northern Ireland and parts of Europe, as well as the USA and Indonesia. We engaged with 16 of the 22 local authorities in Wales. Key partners were national social care organisations in Wales, including the social care regulator and service inspectorate.

            DEEP supported and mentored 53 DEEP catalysts, who attended a 10-week DEEP learning course. During 2020–3, it was challenging for social care staff to attend time-intensive learning opportunities and the attrition rate on this course was approximately 26 per cent. DEEP also supported at least 18 connections, where we linked together like-minded organisations and individuals, and two RPDGs.

            How the people we worked with needed to feel

            People needed to believe that engaging with the DEEP approach was worthwhile. Respondents, who represented the range of stakeholders detailed above, indicated that DEEP events could be useful networking opportunities (mean rating 3.63 out of a maximum of 5). Comparing the number of returns with the number of registrants suggests the response rate to the events questionnaire was at least 19 per cent.
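
            As a rough check, assuming the 387 event questionnaire respondents reported in Figure 1 are set against the event registrations listed above:

            387 / (1,131 + 670 + 29 + 71 + 118) = 387 / 2,019 ≈ 0.19, that is, roughly 19 per cent, which is consistent with the figure given here.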

            It was also beneficial if people engaged with an open mind. It is hard to evidence this, but we observed people being receptive to new ideas. For instance, an event co-host provided this unsolicited feedback: ‘in particular, your presentation into DEEP and the impact of stories has really made them think differently, which is exactly what we wanted.’

            We investigated whether DEEP activities provided the elements for effective learning (security, belonging, continuity, purpose, achievement and significance) detailed in the Senses framework (Nolan et al., 2006). Mean ratings from events and the learning course are illustrated in Figure 1 and Figure 2.

            Figure 1

            Senses ratings provided by 387 event respondents.

            Figure 2

            Senses ratings provided by the 37 catalysts who completed the learning course.

            What people learned and gained

            The DEEP learning course covered the principles and methods of the DEEP approach. Attendees practised and gained experience in methods such as Most Significant Change (Davies and Dart, 2005) and Community of Enquiry (Muirhead, 2018), as well as understanding the DEEP approach’s underpinning principles, such as valuing all types of evidence. Bitesize sessions provided three-hour taster sessions in areas including the Senses framework, the DEEP principles and Community of Enquiry.

            Bitesize sessions could improve confidence, as attested to in this unsolicited feedback: ‘This session gave me the ability and confidence to use exploratory talk to help understand the story more.’ After attending the learning course, some catalysts also said that they felt more confident using the DEEP approach:

            I feel much more confident in some areas such that will help support the teaching and learning and the gathering of evidence. I feel the course has given me a better understanding of how research can be applied in practice. (Other role: assessor)

            However, not all catalysts felt confident at the end of the course, underlining the importance of the post-course mentoring.

            After engaging with DEEP, some people said that they could better appreciate the value of different forms of evidence. For example, a service manager reflected in exemplar one that they had learned the value of stories:

            Yeah we, I think, we’ve always captured numbers so, this was perhaps going, you know, going down a different road in terms of evaluating a project … but I think we’ve seen it as a more meaningful way of capturing outcomes.

            Event questionnaire feedback likewise suggested that respondents could come to understand different forms of evidence. A researcher wrote, ‘the importance of stories, no matter if they be good or bad. Stories enable us to improve as professionals and adapt the way that we work with people.’

            DEEP had also helped partners in exemplars one and two to develop a deeper understanding of their institutional context for using evidence. This example was provided by the project manager in exemplar one:

            and then the conversations about, you know, that about how much wider those panels need to be, who we need to involve in them. Strategically it was something like a thing that started with health and social care but that has grown into a … this needs to be reported through to, you know, the health board internally. It needs to go locally up to the public services board cos that’s a wider membership … but they’ve got kind of similar engagement themes and health and social care themes, so taking it wider than just the people that were originally involved in those discussions.

            However, it was not possible to triangulate this finding with evidence from questionnaires, documents or unsolicited feedback.

            What people did differently

            Event attendees said they intended to change their practice informed by the DEEP approach. For example, respondents wrote about adopting the values of working in a caring way and using storytelling. A frontline staff member said: ‘it has given me knowledge of how to work in a caring way with individuals who have dementia, and also information that I can begin to pass on to my staff team’, while another frontline staff member wrote: ‘I want to use story telling more in my work’.

            Due to the discrete nature of DEEP events, we could not evidence whether these intentions were implemented. However, three months after the learning course, responses on the follow-up questionnaire indicated that catalysts were using aspects of the DEEP approach in their roles (11 per cent response rate). A catalyst who was a frontline staff member said, ‘I am currently completing a master’s course and I have used some of the techniques to describe how to explore the cultural climate within organisations.’ Another catalyst who identified themselves as being in an ‘other role’ commented that ‘my practice has changed as I now discuss the DEEP principles with more confidence to encourage others to think about how their actions impact on those they support’.

            The DEEP approach encourages organisations to put citizens at the centre of decision-making. The exemplars provided evidence of this shift in decision-making processes. The project manager in exemplar one said, ‘Yeah and, you know, for the panels this time, you know, we’ve got community representatives as well … we’ve trained up service users and carers to be on there.’ While a member of the commissioning team interviewed in exemplar two commented: ‘I would believe that the principles of putting the individual at the centre of thinking has been pretty much adopted across the board.’ Furthermore, a researcher interviewed in exemplar three shared, ‘but the thinking and the sharing of ideas and that sort of approach is sort of been, Yeah, we’re trying to look at doing that a bit more.’

            Event attendees also said that they intended to give more consideration to the views of people they work alongside. A frontline staff member intended to have ‘more discussions with residents and families’, and a manager wrote about ‘giving people a safe space to share their stories, which will assist in the development of services and giving those people participating a sense of purpose and the opportunity to develop services for those that follow them’. DEEP catalysts were able to apply this principle in their varied roles, with a researcher saying, ‘the course has changed my approach to teaching and supporting students, allowing students to use their experiences at the pace to design and plan learning’.

            Event attendees gave a mean rating of 4.59 (out of 5) to the likelihood that they would share their learning with others. A lead carers officer interviewed in exemplar two commented: ‘I’ve referred some colleagues across Wales to … the DEEP process as a good way, a good way of working.’ Catalysts said they would champion the DEEP approach through embodying its principles. A catalyst who was a frontline staff member said: ‘I will lead by example I hope and offer what I have gained to others’, and a catalyst who worked as an assessor said: ‘rather than telling learners and colleagues about DEEP principles I will practice them.’

            The difference made by these changes

            DEEP achieved proximal impacts in policy, practice and research.

            Policy

            A DEEP team member helped create a commissioned set of recommendations for unpaid carer short break services in Wales (Carers Trust Wales, 2021). The Welsh government agreed to implement two of the four recommendations made and committed £9 million to the recommended short breaks fund.

            Practice

            DEEP informed three national guidelines in Wales. Guidance (co-written with partners) on personal outcomes focused recording (Social Care Wales, 2020) was well received:

            Just a quick email to let you know that WG [Welsh Government] have included our ‘Friend not foe’ links into their qualitative evidence guidance and CIW [Care Inspectorate Wales] have agreed to endorse the resource … this is really down to this being a great piece of work that has been welcomed by the sector.

            The DEEP approach was also referenced in the Performance and Improvement Framework 2022 guidance for Social Services. This guidance supports local authorities to fulfil their reporting requirements under the Social Services and Well-being (Wales) Act 2014 (Welsh Government, 2014).

            The exemplars provided evidence of more local practice change. The project manager interviewed in exemplar one talked about how ‘they’ve started at every carer’s group meeting … they read a Most Significant Change story … to remind people why they’re there’. Meanwhile, the commissioning team member interviewed in exemplar two said ‘that principle of engagement and, co-production, you know, we’ve recognised the benefits from that work we did with [DEEP] and we’re making sure that we continue to do it in other service areas’. In addition, the head of learning and impact interviewed in exemplar three shared that ‘we are at the moment using Most Significant Change … So, we’re currently collecting the stories … and we’ve been collecting stories from commissioners, staff, participants, family members and we’re still in the process.’

            Research

            The DEEP team contributed to three successful submissions to UK research funders and co-supervised three Knowledge Economy Skills Scholarship II Masters by Research projects. Team members co-authored four research papers and three book chapters.

            Improved citizen well-being

            It was difficult to evidence the pathway from enhanced practice, policy and research to improved services and citizen well-being. However, we captured evidence that this could happen. A member of the commissioning team interviewed in exemplar two highlighted that working with the DEEP approach had contributed to a better specification for commissioning services for unpaid carers:

            We’ve built all that into the service spec so that when we actually went out to tender and commissioned, we were putting those people front and centre of the, of the service specification and making sure that what matters to them is prioritised as the most important factor in the service delivery.

            Exemplars one and two also provided insight into how practice change that resulted from using the DEEP approach could enhance citizen well-being. The lead carers officer interviewed in exemplar two said: ‘So, on the day there were carers that could be helped ... They’d identified that they needed help with a particular thing that was easily done on the day so there was an immediate result.’ In exemplar one, the community connector manager commented:

            I think on behalf of the person as well it’s about them being able to tell their story and know that someone’s going to listen to it and make a judgement if you like … so people feel as if their story’s important.

            Sustainability

            Sustainability can encompass the programme itself continuing, the incorporation of learning into policy and everyday practice, a physical infrastructure, lasting resource and continuance of the benefits (Jones and Verity, 2021). By 2023 the DEEP approach was becoming embedded within strategic partner organisations, and aspects of the approach were included in two social care training and qualification syllabuses in Wales. Other foundations for sustainability are outlined in Table 5. The passage of time will evidence whether these foundations are sufficient to sustain the DEEP approach in Wales.

            Table 5

            Foundations for sustaining the DEEP approach.

            Foundation: Incorporating the DEEP approach into policy and the ‘everyday’
            Strategies: The DEEP approach included in training syllabuses; DEEP catalyst role; contribution to policy and national guidance documents
            Outcomes: Two training syllabuses reference the DEEP approach; 53 DEEP catalysts working in Wales; one policy and three guidance documents informed by the DEEP approach

            Foundation: Resources
            Strategies: DEEP learning course; DEEP catalysts; DEEP materials
            Outcomes: Learning course (N: 4) and bitesize sessions (N: 37) provided with accompanying materials; catalysts share and model the DEEP approach

            Foundation: Continuing benefits
            Strategies: Inherent characteristics of the DEEP approach; creating critical mass
            Outcomes: DEEP ‘teaches people to fish’; at least 18 like-minded individuals and organisations connected

            Discussion: learning and reflections

            The collated data provide relatively strong evidence that the DEEP approach made proximal contributions. The programme increased people’s confidence in using the DEEP approach (though ongoing support was also important) and people identified a deepening in how they valued different types of evidence. Other capacity-building initiatives have reported similar findings, for example Mugabo et al. (2015) and Wenke and Mickan (2016). Others have expressed hope that approaches informed by social pedagogy would bring such impacts in social care in the UK. For instance, Hunter (2020, p. 8) proposed that social pedagogy could support the use of creative approaches with older adults by balancing ‘head, hands and hearts’. Increased confidence has also been reported as an outcome in other social pedagogy-informed initiatives (ThemPra, 2015b). There was weaker evidence that people gained a greater understanding of their context for using evidence. This might be an evaluation artefact as we did not ask about this on every questionnaire, but it was notable that DEEP catalysts did not identify this as an outcome.

            The evaluation suggests that the DEEP programme motivated people to put their learning into practice, which could include promoting the approach to others and ensuring that citizens were central in decision-making. This outcome was undoubtedly enhanced by the synergy between the DEEP approach and social care policy in Wales. Unsolicited feedback provided evidence that some of these intentions had been actioned, but the low response rate on the questionnaire that was sent three months after the learning course limits our understanding of the practice changes made by DEEP catalysts.

            Distal impacts were harder to evidence. There was strong evidence that the DEEP approach contributed to practice, policy and research as has been reported for other initiatives (for example, Harper and Dickson, 2019; Karlsson et al., 2008). We have evidence from the exemplars that the DEEP approach could support service improvement and enhance citizen well-being, but we cannot quantify how often this contribution occurred. Furthermore, although foundations for sustaining the DEEP approach were established, we cannot determine whether these foundations will be sufficient.

            The evaluation context is likely to have moderated the programme impacts. The Health Protection (Coronavirus Restrictions) (Wales) Regulations 2020 prohibited face-to-face events during some of 2020–2023. Given the relational nature of the DEEP approach, it is possible that greater impact would have been captured if more face-to-face events had been held. Indeed, event attendees gave comparatively low ratings for networking opportunities.

            Reflecting on the evaluation findings, we conclude that the DEEP approach has put in place the foundations for successful engagement with different types of evidence. The DEEP programme created supportive learning environments, increased confidence and inspired people to take their learning into practice. A contributory factor to these impacts seemed to be the programme focus on relational and responsive processes rather than fixed and predetermined methods. The elements of DEEP and its principles seemed to resonate with many people working in social care. We observed that whenever social pedagogy principles were introduced through DEEP learning and development work, they were well received. Anecdotally, they seemed to motivate and encourage people as they were an alternative to more methods-based learning approaches. The findings also highlighted that, because DEEP, like social pedagogy, is not a single method, it can be hard to communicate what it does. This suggests that incorporating more practice opportunities into our learning courses might further increase people’s confidence in using the approach.

            Strengths and limitations of the evaluation

            The theory of change that formed the basis for the evaluation was co-produced with the programme steering group. The evaluation covered a three-year period and incorporated multiple types of evidence. The contribution analysis approach recognised the multifaceted nature of the programme and the complex environment within which DEEP operated. The evaluation was not conducted independently, and to minimise bias:

            • the case exemplar data was collected and analysed by a team member not involved in the engagement

            • questionnaires were returned anonymously and some returns suggested areas for improvement, such as this suggestion from a frontline staff member following an event: ‘more participatory practice of the principles to help see it in action’

            • documentary evidence could be from independent sources

            • unsolicited feedback was included.

            As DEEP events were often discrete, we have weaker evidence that the DEEP approach was put into practice, and unfortunately there was a low response rate to the learning course three-month follow-up questionnaire. In future evaluations it would be beneficial to monitor if and how intended practice changes are implemented. This will also contribute learning about what impacts are sustained in the longer term.

            Implications for future work

            The learning from the evaluation can be distilled into six criteria that inform how to build capacity in the gathering and use of evidence. Although speaking to the social care sector, we believe these lessons could be transferable across sectors. To build foundations for better services and well-being, capacity-building initiatives in evidence work must be:

            1. relational: meaningfully engaging stakeholders who come from different world views

            2. holistic: covering all aspects of evidence work

            3. practice-focused: providing mentoring, experiential and tailored support

            4. multifaceted: working at individual, organisational and strategic levels

            5. co-produced: informed by intended beneficiaries and local needs

            6. contextualised: grounded in the specific issues and circumstances that call for evidence.

            Conclusion

            DEEP is an approach informed by social pedagogy principles and practices that takes a holistic and relational approach to building capacity in evidence work. It operates across the social care sector in Wales, reaching out to citizens, practitioners, policymakers and infrastructure staff. It has distinctive features in the context of previously evaluated initiatives and concurrent UK programmes.

            There is relatively strong evidence that the 2020–3 programme contributed to the acquisition of new knowledge and increased confidence in using the DEEP approach. Following our engagements, people intended to put the DEEP approach into practice and there was evidence that some practice change did result. These practice changes could improve social care services and enhance citizen well-being.

            The evaluation findings inform future capacity-building initiatives. Programmes that promote capacity in gathering and using evidence should be relational, holistic, practice-focused, multifaceted, contextualised and be co-produced with intended beneficiaries.

            Acknowledgements

            We thank the members of the DEEP steering group and the DEEP academic advisory group who supported the evaluation. Without their support the positive impacts that we describe could not have been achieved. We also thank Matter of Focus for their assistance in planning the evaluation and facilitating the development of the DEEP theory of change.

            Declarations and conflicts of interest

            Research ethics statement

            The authors declare that research ethics approval for the evaluation was provided by Swansea University College of Human and Health Sciences Research Ethics Committee.

            Consent for publication statement

            The authors declare that research participants’ informed consent to publication of findings – including photos, videos and any personal or identifiable information – was secured prior to publication.

            Conflicts of interest statement

            At the time of acceptance of the article, F. Verity had moved to Brunel University London. The authors declare no conflicts of interest with this work. All efforts have been made to sufficiently anonymise the authors during peer review of this article. The authors declare no further conflicts with this article.

            References

            1. Andrews N, Gabbay J, Le May A, Miller E, Petch A, O’Neill M. Story, dialogue and caring about what matters to people: Progress towards evidence enriched policy and practice. Evidence and Policy. 2020. Vol. 16(4):597–618. [Cross Ref]

            2. Andrews N, Gabbay J, Le May A, Miller E, O’Neill M, Petch A. Developing evidence-enriched practice in health and social care with older people. Joseph Rowntree Foundation. 2015

            3. Arnold M. E. Developing evaluation capacity in extension 4-H field faculty: A framework for success. American Journal of Evaluation. 2006. Vol. 27(2):257–69. [Cross Ref]

            4. Auspos P, Cabaj M. Complexity and community change managing adaptively to improve effectiveness. Aspen Institute. 2014

            5. Bate P, Robert G. Bringing user experience to healthcare improvement: The concepts, methods, and practices of experience-based design. Radcliffe. 2007

            6. BBC. NHS Wales in dangerous and precarious state – BMA Cymru. 2022. https://www.bbc.co.uk/news/uk-wales-63067230

            7. Beresford P. Public participation in health and social care: Exploring the co-production of knowledge. Frontiers in Sociology. 2018. Vol. 3(41):1–12. [Cross Ref]

            8. Beresford P. PPI or user involvement: Taking stock from a service user perspective in the twenty first century. Research Involvement and Engagement. 2020. Vol. 6:36[Cross Ref]

            9. Berger P. L, Luckmann T. The social construction of reality: A treatise in the sociology of knowledge. Anchor Books. 1966

            10. Boody H, Oliver C. Research governance in children’s services: The scope for new advice. 2010. https://www.researchgate.net/profile/Janet_Boddy/publication/268058904_DFE-RB072_Research_governance_in_children’s_services_the_scope_for_new_advice/links/54db97b30cf28d3de65bbeb6.pdf

            11. Boote J, Baird N, Beecroft C. Public involvement at the design stage of primary health research: A narrative review of case examples. Health Policy. 2010. Vol. 95(1):10–23. [Cross Ref]

            12. Bruner J. The narrative construction of reality. Critical Inquiry. 1991. Vol. 18(1):1–21. [Cross Ref]

            13. Cameron C, Moss P, Petrie P. Towards a social pedagogic approach for social care. International Journal of Social Pedagogy. 2021. Vol. 10(1):7[Cross Ref]

            14. Carers Trust Wales. What a difference: A vision for the future of short breaks for unpaid carers in Wales. Carers Trust Wales. 2021

            15. Choi B. C. K, Pant T, Lin V, Puska P, Sherman G, Goddard M, Ackland M. J, Sainsbury P, Stachenk O. S, Morrison H, Clottey C. Can scientists and policymakers work together? Journal of Epidemiology and Community Health. 2005. Vol. 59(8):632–37. [Cross Ref]

            16. Davies R, Dart J. The ‘most significant change’ technique: A guide to its use. Care International. 2005

            17. Donley E, Moon F. Building social work research capacity in a busy metropolitan hospital. Research on Social Work Practice. 2021. Vol. 31(1):101–7. [Cross Ref]

            18. Escobar O. Public dialogue and deliberation: A communication perspective for public engagement practitioners. Edinburgh Beltane. 2011

            19. Ferguson I, Woodward R. Radical social work in practice: Making a difference. Bristol University Press. 2009

            20. Flyvbjerg B. Making social science matter: Why social inquiry fails and how it can succeed again. Cambridge University Press. 2001

            21. Fox M, Hopkins D. Building research capacity in hospital-based social workers: A participatory action research approach. Qualitative Social Work. 2021. Vol. 22(1):123–39. [Cross Ref]

            22. Gazeley L, Lofty F, Longman P, Squire R. Under-tapped potential: Practitioner research as a vehicle for widening participation. Journal of Further and Higher Education. 2019. Vol. 43(7):1008–20. [Cross Ref]

            23. Ghate D, Hood R. Using evidence in social care. What works now? Evidence-informed policy and practice. Boaz A, Davies H, Fraser A, Nutley S. Policy Press. 2019. p. 89–104

            24. Graneheim U. H, Lundman B. Qualitative content analysis in nursing research: Concepts, procedures, and measures to achieve trustworthiness. Nurse Education Today. 2004. Vol. 24(2):104–12. [Cross Ref]

            25. Hall B, Tandon R. Decolonization of knowledge, epistemicide, participatory research and higher education. Research for All. 2017. Vol. 1(1):6–19. [Cross Ref]

            26. Harper L. M, Dickson R. Using developmental evaluation principles to build capacity for knowledge mobilisation in health and social care. Evaluation. 2019. Vol. 25(3):330–48. [Cross Ref]

            27. Hatton K. Social pedagogy in the UK: Theory and practice. Russell House. 2013

            28. Head B. W. Reconsidering evidence-based policy: Key issues and challenges. Policy and Society. 2010. Vol. 29(2):77–94. [Cross Ref]

            29. Health and Care Research Wales. Making research careers work: A review of career pathways in health and social care in Wales. HCRW. 2022

            30. Hunter R. Older people and creativity: What can a social pedagogical perspective add to this work? International Journal of Social Pedagogy. 2020. Vol. 9(1):8[Cross Ref]

            31. Huxley P. Social care research priorities and capacity in Wales: A consultation exercise. Swansea University Centre for Social Work and Social Care Research School of Human Sciences. 2009

            32. Ife J. Community development in an uncertain world. Cambridge University Press. 2018

            33. Inoue M, Tsai L. C, Lee J. S, Ihara E. S, Tompkins C. J, Aguimatang J, Fountain K, Hudson S. Teaching note: Creating an integrative research learning environment for BSW and MSW students. Journal of Social Work Education. 2017. Vol. 53(4):759–64. [Cross Ref]

            34. Jones M. G, Verity F. Rethinking sustainability in childhood obesity prevention interventions: Learning from South Australia’s Obesity Prevention and Lifestyle (OPAL) programme. Health Promotion International. 2021. Vol. 37(1):1–14. [Cross Ref]

            35. Karlsson P.-Å, Beijer E, Eriksson B, Leissner T. Evaluation workshops for capacity building in welfare work: Some Swedish examples. Evaluation. 2008. Vol. 14(4):483–98. [Cross Ref]

            36. Kirkwood S, Roesch-Marsh A, Cooper S. Evaluating social pedagogy in the UK: Methodological issues. Qualitative Social Work. 2019. Vol. 18(1):8–23. [Cross Ref]

            37. Lachini A. L, Clone S, Dehart D. D, Seay K. D, Browne T. Project STRONG: A capacity-building intervention to improve grant writing among substance abuse organisations. Journal of Social Work Practice in the Addictions. 2016. Vol. 16(4):403–20. [Cross Ref]

            38. Lamont T, Allen A, Geoghegan L, Clark M, Goulding C, Manthorpe J. Enabling better use of evidence in social work policy and practice. BASW and NIHR. 2020

            39. Littleton K, Mercer N. Inter-thinking: Putting talk to work. Routledge. 2013

            40. Lowe T, Plimmer D. Exploring the new world: Practical insights for funding, commissioning, and managing in complexity. Tudor Trust. 2019

            41. Lunt N. T, Ramian K, Shaw I, Fouché C, Mitchell F. Networking practitioner researchers: Synthesis of the state of the ‘art’. European Journal of Social Work. 2012. Vol. 15(2):185–203. [Cross Ref]

            42. Matter of Focus. Our approach. Learn, improve, evidence, and tell the story of the difference you make. 2017. https://www.matter-of-focus.com/our-approach/

            43. Mayne J. Contribution analysis: Coming of age? Evaluation. 2012. Vol. 18(3):270–80. [Cross Ref]

            44. McLaughlin H. Understanding social work research. 2nd ed. Sage. 2012

            45. McMillan E. Complexity management and the dynamics of change. Routledge. 2008

            46. Mignone J, Hinds A, Duncan K. A, Migliardi P, Krawchuk M, Kimasevych B. One-room school: The summer institute in program evaluation. Canadian Journal of Program Evaluation. 2018. Vol. 33(2):268–78. [Cross Ref]

            47. Mugabo L, Rouleau D, Odhiambo J, Nisingizwe M. P, Amoroso C, Barebwanuwe P, Warugaba C, Habumugisha L, Hedt-Gauthier B. L. Approaches and impact of non-academic research capacity strengthening training models in sub-Saharan Africa: A systematic review. Health Research Policy and Systems. 2015. Vol. 13:30. [Cross Ref]

            48. Muirhead S. Community of enquiry guide. IRISS. 2018

            49. Nolan M, Brown J, Davies S, Nolan J, Keady J. The senses framework: improving care for older people through a relationship-centred approach, Getting Research Into Practice (GRIP) project report no 2. University of Sheffield. 2006

            50. Orme J, Powell J. Building research capacity in social work: process and issues. British Journal of Social Work. 2007. Vol. 38(5):988–1008. [Cross Ref]

            51. Petrie P. Social pedagogy in the UK: gaining a firm foothold? Education Policy Analysis Archives. 2013. Vol. 21(37):1–13. [Cross Ref]

            52. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. Analysing qualitative data. Bryman A, Burgess R. G. Routledge. 1994. p. 173–94

            53. Rubin C. L, Martinez L. S, Tse L, Brugge D, Hacker K, Pirie A, Leslie L. K. Creating a culture of empowerment in research: findings from a capacity-building program. Program of Community Health Partnerships. 2016. Vol. 10(3):479–88. [Cross Ref]

            54. Rycroft-Malone J. Evidence-informed practice: From individual to context. Journal of Nursing Management. 2008. Vol. 16(4):404–8. [Cross Ref]

            55. Seddon D, Toms G, Verity F. Social care: Research, policy, and practice in Wales. Social policy for welfare practice in Wales: New directions. 3rd ed. Gwilym H, Williams C. British Association of Social Workers. 2021. p. 19–35

            56. Senedd Research. Social care: A system at breaking point? 2021. https://research.senedd.wales/research-articles/social-care-a-system-at-breaking-point/

            57. Social Care Wales. Friend not foe: Supporting meaningful outcome focused recording in social care in Wales. Welsh Government. 2020

            58. Srivastava A, Thomson S. B. Framework analysis: A qualitative methodology for applied policy research. Journal of Administration & Governance. 2009. Vol. 4(2):72–9

            59. ThemPra. Overview of social pedagogy in the UK. 2015a. https://www.thempra.org.uk/resources/overview-of-social-pedagogy-in-the-uk/

            60. ThemPra. Social pedagogy impact. 2015b. https://www.thempra.org.uk/social-pedagogy/social-pedagogy-impact/

            61. Welsh Government. Social Services and Well-being (Wales) Act 2014. The Stationery Office Limited. 2014

            62. Welsh Government. Well-being of Future Generations (Wales) Act 2015. The Stationery Office Limited. 2015

            63. Wenke R, Mickan S. The role and impact of research positions within health and care settings in allied health: A systematic review. BMC Health Services Research. 2016. Vol. 16(1):335[Cross Ref] [PubMed]

            64. Withington T, Alcorn N, Maybery D, Goodyear M. Building research capacity in clinical practice for social workers: A training and mentorship approach. Advances in Mental Health. 2020. Vol. 18(1):73–90. [Cross Ref]

            Author and article information

            Journal
            IJSP
            International Journal of Social Pedagogy
            UCL Press
            2051-5804
            10 January 2024
            Volume 13, Issue 1, Article 1
            Affiliations
            [1 ]School of Medical and Health Sciences, Bangor University, UK
            [2 ]Swansea University, UK
            Author notes
            [* ]Correspondence: g.toms@bangor.ac.uk
            Author information
            https://orcid.org/0000-0001-5553-573X
            https://orcid.org/0000-0002-7354-4397
            https://orcid.org/0000-0002-4166-2794
            https://orcid.org/0000-0003-2855-8105
            Article
            IJSP-13-1
            10.14324/111.444.ijsp.2024.v13.x.001
            cbd1c513-753f-48a7-90ea-13e193940d1a
            2024, Gill Toms, Fiona Verity, Nick Andrews and Richenda Leonard.

            This is an open-access article distributed under the terms of the Creative Commons Attribution Licence (CC BY) 4.0 https://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited • DOI: https://doi.org/10.14324/111.444.ijsp.2024.v13.x.001.

            History: 25 August 2023; 21 November 2023
            Page count
            Pages: 17
            Funding
            Funded by: Welsh government through Health and Care Research Wales
            Award ID: 2020–3
            Funded by: Social Care Wales
            Award ID: 2023–5
            The Developing Evidence Enriched Practice programme 2020–3 was funded by the Welsh government through Health and Care Research Wales. The 2023–5 programme is funded by Social Care Wales.
            Categories
            Research article
            Custom metadata
            Toms, G., Verity, F., Andrews, N. and Leonard, R. (2024). Going DEEP: an evaluation of a social pedagogy informed approach to evidence enriched practice in social care. International Journal of Social Pedagogy, 13(1): 1. DOI: https://doi.org/10.14324/111.444.ijsp.2024.v13.x.001.

            Sociology, Education, Social policy & Welfare, General social science, General behavioral science, Family & Child studies
            social care, Developing Evidence Enriched Practice (DEEP), capacity building, contribution analysis, evaluation
