
      How can impact strategies be developed that better support universities to address twenty-first-century challenges?


            Abstract

To better address twenty-first-century challenges, research institutions often develop and publish research impact strategies, but as a tool, impact strategies are poorly understood. This study provides the first formal analysis of impact strategies from the UK, Canada, Australia, Denmark, New Zealand and Hong Kong, China, and from independent research institutes. Two types of strategy emerged. First, ‘achieving impact’ strategies tended to be bottom-up and co-productive, with a strong emphasis on partnerships and engagement; compared with the second type, they were more likely to target specific beneficiaries with structured implementation plans, to use boundary organisations to co-produce research and impact, and to recognise impact with less reliance on extrinsic incentives. Second, ‘enabling impact’ strategies were more top-down and incentive-driven, developed to build impact capacity and culture across an institution, faculty or centre, also with a strong focus on partnerships and engagement; they invested in dedicated impact teams and academic impact roles, supported by extrinsic incentives including promotion criteria. This typology offers a new way to categorise, analyse and understand research impact strategies, alongside insights that practitioners may use to guide the design of future strategies, considering the limitations of top-down, incentive-driven approaches versus more bottom-up, co-productive approaches.


            Key messages
            When designing an institutional impact strategy, it is important to consider the limitations of top-down, incentive-driven ‘enabling impact’ approaches versus more bottom-up, co-productive ‘achieving impact’ approaches, and how the most appropriate elements of both types of strategy may be combined in a given institutional context.
            ‘Enabling impact’ strategies tend to build impact capacity and culture across an institution or unit, with a strong focus on partnerships and engagement, and to invest in dedicated impact teams and academic impact roles, supported by extrinsic incentives.
            ‘Achieving impact’ strategies tend to also have a strong emphasis on partnerships and engagement, but they are more likely to target specific beneficiaries with structured implementation plans, use boundary organisations to co-produce research and impact, and recognise impact with less reliance on extrinsic incentives.

            Introduction

As the world faces many new and complex challenges, research funders and governments are increasingly seeking evidence that public investment in research leads to wider societal impacts. For higher education and research institutions, the rise of this ‘impact agenda’ has generally been incorporated into formal systems and policies designed to assess the quality of research. In the UK, for example, impact is assessed retrospectively via the Research Excellence Framework (REF), and similar systems exist in Australia, Hong Kong, the United States, Sweden, Italy, Spain and elsewhere (Geuna and Piolatto, 2016; Chubb and Reed, 2018; Heyeres et al., 2019; Reichard et al., 2020; Reed et al., 2021). Although formal evaluation of impact has been criticised for its negative unintended consequences for individual researchers and research priorities (Chubb and Watermeyer, 2017; Watermeyer, 2019; Reed and Fazey, 2021), incentivising research impact is widely argued to help justify public investment in research and to foster more accountable research with long-term benefits for society (Hill, 2016; Chubb and Reed, 2017; Reichard et al., 2020; Reed et al., 2021).

            Research impact policy is part of a broader trend of seeing universities at the heart of a knowledge economy (Chubb et al., 2017), and higher education and research institutions are responding through increased investment in impact and the associated capacity required (Oancea, 2019; Watermeyer, 2019). Strategic institutional responses to the impact agenda have varied. Some have developed new and innovative institutional structures to enhance the use of science in policy and practice (for example, Bruce and O’Callaghan, 2016; Cvitanovic et al., 2018), while others have created new organisational roles such as ‘knowledge brokers’ to improve knowledge exchange between scientists and policymakers (Meyer, 2010; Cvitanovic et al., 2018). However, a common response has been the integration of impact strategies (or similar) into university-wide policy and practice, reflecting the growing trend for accountability within the university system to research funders and users (Penfield et al., 2014). As a result, impact is now widely seen as an important part of an institution’s research culture (Leeuwis et al., 2018), and a small but growing body of literature is beginning to understand how the impact agenda is shaping organisational cultures across the sector (for example, Moran et al., 2020; Rickards et al., 2020; Reed and Fazey, 2021). There is also increasing recognition that an institution’s impact culture needs to go beyond simply monitoring and evaluating impact (MacGregor et al., 2020; MacGregor and Phipps, 2020). Instead, an institution needs to develop conditions at both individual and organisational levels for generating impact, including the skills and capability to conduct action-oriented and robust research to underpin impact, and the generation of social capital within stakeholder networks and partnerships to address societal challenges (MacGregor et al., 2020; Reed and Fazey, 2021).

            Although there is a wealth of literature that has explored the process of research impact assessment in different countries and in different disciplines (for example, Penfield et al., 2014; Cook et al., 2017; Joly and Matt, 2017; Adam et al., 2018; Weißhuhn et al., 2018; Tsey et al., 2019), there has been no international study to date that systematically analyses higher education and research institutions’ own impact strategies. As a result, impact strategies are poorly understood, and there is no formal understanding of the different approaches that institutions are taking to developing their impact culture. There is also a lack of any critical appraisal of the strengths and weaknesses of current strategic planning and practice. There are no existing publications that set out the range of practices proposed under current institutional impact strategies, and this limits the ability of higher education and research institutions to learn about the diversity of approaches taken elsewhere. Given the potential for well-designed impact strategies to enable institutions and their researchers to address the complex challenges that we face in the twenty-first century, it is important to address this knowledge gap.

            In response, this study explores what types of impact strategies higher education and research organisations are adopting to drive impact from research. The work provides the first formal analysis of impact strategies from around the world, informing a typology of strategies that can be used to guide the development of future institutional impact strategies. In considering how future strategies might build institutional impact capacity and impact culture more effectively, the findings suggest which mechanisms might offer the most potential to transform how institutions operate in this space and enable researchers to address twenty-first-century challenges.

            Methods

A total of 77 impact strategies were analysed: 37 from the UK, 9 from Canada, 8 from Australia, 8 from Hong Kong, 8 from Denmark, 2 from New Zealand and 5 from independent research organisations (Table 1). The majority (66 strategies) were for higher education institutions (mainly universities); the sample included 19 per cent, 9 per cent, 19 per cent, 100 per cent, 88 per cent and 25 per cent of higher education institutions in the UK, Canada, Australia, Hong Kong, Denmark and New Zealand respectively. (However, note the different approach to identifying impact strategies for Hong Kong, China and Denmark below, which may account for the higher proportions found in these jurisdictions.) Two strategies were found representing sub-units within UK universities (a college and a research institute operating within universities), four were cross-institutional research programmes or centres (N8 AgriFood, ClimateXChange Centre of Expertise, WISERD and Third Sector Research Centre) and five were independent research institutes (Institute of Development Studies, Plymouth Marine Laboratory, Stockholm Environment Institute, CGIAR and CERN).
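As a simple cross-check of the sample composition reported above, the jurisdiction counts can be tallied; this is an illustrative sketch only, using the figures exactly as stated in the text:

```python
# Strategy counts by jurisdiction, as reported in the Methods text above.
sample = {
    "UK": 37,
    "Canada": 9,
    "Australia": 8,
    "Hong Kong, China": 8,
    "Denmark": 8,
    "New Zealand": 2,
    "Independent research organisations": 5,
}

total = sum(sample.values())
assert total == 77  # matches the reported total of 77 strategies analysed
print(total)
```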

            Table 1.

            Strategies included in the sample (Source: Authors, 2022)

Name | Organisational unit | Document name | Date, where stated
UK academic institutions, programmes and units
Aberystwyth University | University | Towards the Next 150 Years: Aberystwyth University Strategic Plan 2018–2023 | 2018
Anglia Ruskin University | University | Research and Innovation Strategy 2018–2022 | 2018
Brunel University London | University | Brunel 2030: A university for a changing world | Not given
ClimateXChange Centre of Expertise | Cross-university research programme | A Knowledge Exchange Model for Research, Policy and Practice | 2016
College of Arts, Humanities and Social Sciences, University of Edinburgh | College within a university | Strategy for Research and Knowledge Exchange | 2016
De Montfort University | University | Research Strategy 2018–2023 | 2018
Durham University | University | Durham University Strategy 2017–2027 | 2017
Goldsmiths, University of London | University | Goldsmiths’ Strategy 2018–2023 | 2018
Imperial College London | University | Pathways to Societal Impact | 2016
Institute of Health and Wellbeing, University of Glasgow | Research institute within a university | Public Engagement and Knowledge Exchange Strategy April 2012 | 2012
Keele University | University | Keele Research Strategy | 2020
King’s College London | University | King’s Strategic Vision 2029 | Not given
Leeds Trinity University | University | Research Strategy 2018–21 | 2018
London Metropolitan University | University | Strategy 2019/20–2024/25 | 2019
Manchester Metropolitan University | University | Research and Knowledge Exchange Strategy | 2017
N8 AgriFood | Cross-university research programme | N8 Agrifood Theory of Change and Logic Model | 2017
Norwich University of the Arts | University | Research Strategy 2015–2020 | 2015
Queen Mary University of London | University | Strategy 2030 | Not given
Sheffield Hallam University | University | Impact Strategy for Research and Knowledge Exchange | Not given
SOAS University of London | University | SOAS Vision and Strategy 2016–2020 | 2016
The London School of Economics and Political Science | University | LSE 2030 Strategy | 2019
Third Sector Research Centre | Cross-university research centre | Knowledge Exchange, Communication & Impact Strategy | 2010
University of Bath | University | Looking Further University Strategy 2016–2021 | 2016
University of Brighton | University | Research and Enterprise Strategic Plan 2017–2021 | 2017
University of Exeter | University | Research and Impact Strategy 2015–20 | 2015
University of Glasgow | University | Knowledge Exchange and Innovation Strategy 2017–2021: Changing the world through engagement – innovation – impact | 2017
University of Lincoln | University | Thinking Ahead 2016–2021: University of Lincoln strategic plan | 2016
University of Liverpool | University | Our Strategy 2026 | 2016
University of Oxford | University | Strategic Plan 2018–23 | 2018
University of Portsmouth | University | Research and Innovation Strategy 2015–2020 | 2015
University of South Wales | University | Research Strategy 2018–2028 | 2018
University of Sunderland | University | Impact Strategy | Not given
University of the Highlands and Islands | University | Research, Impact and Knowledge Exchange Strategic Plan 2018–2023 | 2018
University of the West of England | University | Research Impact Strategy | 2015
University of Warwick | University | Our Research Strategy | Not given
Ulster University | University | Research & Impact Strategy 2017–2022 | 2017
Wales Institute of Social and Economic Research and Data (WISERD) | Cross-university research centre | Engaging for Impact – WISERD’s Knowledge Exchange Strategy 2012–2015 | 2012
Australian universities
Charles Darwin University | University | Connect Discover Grow | 2015
La Trobe University | University | Research Impact Strategy 2019–2022 | 2019
Queen’s University | University | Our Future is You: A strategic plan for our shared future 2019–2023 | 2019
Monash University | University | Research Agenda 2020 | 2019
University of Queensland | University | Strategic Plan 2018–2021 | 2018
University of Melbourne | University | Research at Melbourne: Ensuring excellence and impact to 2025 | Not given
University of Wollongong | University | Research Impact Strategy: White Paper | 2019
Victoria University | University | Strategic Plan 2016–2020 | 2016
New Zealand universities
Massey University | University | Research Strategy 2018–2022 | 2018
University of Auckland | University | Research Impact Strategy | 2019
Canadian universities
Athabasca University | University | Strategic Research Plan 2018–2022 | 2018
Memorial University of Newfoundland | University | Public Engagement Framework | 2012
Queen’s University | University | Strategic Research Plan 2018–2023 | 2017
University of British Columbia | University | Enhancing KMb@UBC: Mobilizing UBC research in the policy realm | 2017
University of Calgary | University | 2018–23 Research Plan | 2018
University of Manitoba | University | Strategic Research Plan 2015–2020 | 2015
University of Ottawa | University | Research with Impact: Knowledge mobilization institutional strategy 2019–2021 | 2019
University of Regina | University | All Our Relations Strategic Plan 2020–2025 | 2020
University of Waterloo | University | Connecting Imagination with Impact | 2020
Hong Kong higher education institutes
Chinese University of Hong Kong | University | CUHK Strategic Plan 2016–2020 | 2016
City University of Hong Kong | University | Strategic Plan 2020–2025: World-class research and education | 2020
Education University of Hong Kong | University | Strategic Plan 2016–2025 | 2016
University of Hong Kong | University | Asia’s Global University: The next decade – Our vision for 2016–2025 | 2016
Hong Kong Polytechnic University | University | Shaping the Future: Strategic plan 2019/20–2024/25 | 2019
Hong Kong Baptist University | University | Strategic Plan 2018–2028: Climb high, gaze far | 2018
Hong Kong University of Science and Technology | University | HKUST 5 Year Strategic Plan 2020: Innovating today, imagining tomorrow | 2020
Lingnan University | University | Stronger, Higher: 2019–2025 strategic plan | 2019
Danish higher education institutes
University of Copenhagen | University | Talent and Collaboration: Strategy 2023 | 2017
Copenhagen Business School | Higher education institute | CBS Strategy | 2020
Aarhus University | University | Aarhus University Strategy 2025 | 2020
Syddansk Universitet | University | SDU's Fundamental Narrative: ‘Our SDGs’ | Not given
Aalborg University | University | Knowledge for the World. Aalborg University strategy 2016–2021 | 2016
Roskilde University | University | Strategy RUC 2030: Interconnected | Not given
Technical University of Denmark | University | Strategy 2020–2025: Technology for people | 2020
IT University of Denmark | University | ITU Strategy 2022–2025 | 2021
Independent research institutes
CERN | Independent research institute | 2020 Update of the European Strategy for Particle Physics by the European Strategy Group | 2020
CGIAR | Independent research institute | Research Strategy 2030: Ending hunger by 2030 through science to transform food, land and water systems in a climate crisis | Not given
Institute of Development Studies | Independent research institute | Engaged Excellence for Global Development Strategy 2015–20 | 2015
Plymouth Marine Laboratory | Independent research institute | PML Research Impact Plan 2020–2025 | 2020
Stockholm Environment Institute | Independent research institute | Strategy 2020–2024 | 2020

Impact strategies were identified in four ways. First, because impact strategies are non-peer-reviewed texts, a general-purpose internet search engine (Google) was used to identify them by combining search terms including ‘impact’, ‘knowledge exchange’, ‘knowledge mobilisation’ and ‘knowledge transfer’ with ‘strategy’, ‘strategic plan’, ‘university’, ‘higher education’, ‘research’, ‘research centre’ and ‘research institute’. Second, strategic documents were sourced via international email lists, including the JISCMail International Impact Network (which has a bias towards Australia), the Association of Research Managers and Administrators list (exclusive to the UK) and the Fast Track Impact list (with global coverage, but with a bias towards the UK, Australia and New Zealand). Third, additional strategies known to the authors (including two unpublished strategies that were made available for the analysis) were included in the sample.
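As an illustration only, the combinatorial pairing of search terms described above could be generated with a short script; the exact query syntax the authors used is not specified, so the quoting below is an assumption:

```python
from itertools import product

# Impact-related terms and institutional/strategy terms listed in the Methods text.
impact_terms = ["impact", "knowledge exchange", "knowledge mobilisation", "knowledge transfer"]
context_terms = ["strategy", "strategic plan", "university", "higher education",
                 "research", "research centre", "research institute"]

# Pair every impact term with every context term to form candidate search queries.
queries = [f'"{a}" "{b}"' for a, b in product(impact_terms, context_terms)]

print(len(queries))   # 28 candidate queries
print(queries[0])     # e.g. "impact" "strategy"
```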

Finally, to evaluate whether strategies might have been missed because searches were conducted only in English, strategies were systematically assessed for all universities in two non-English-speaking jurisdictions, Denmark and Hong Kong, China (translating strategies into English for analysis where necessary). These jurisdictions were selected on the basis of one author’s in-depth knowledge of the Danish higher education system and the well-developed impact evaluation system in Hong Kong. Of the higher education institutions in both jurisdictions, all but one (which did not have a publicly available detailed strategy) included substantive impact goals and activities in their institutional strategies, and these were brought into the sample.

            Inclusion criteria were that strategies: (1) were for a unit or institution that conducts its own research (strategies by networks such as Research Impact Canada and funding organisations were excluded); (2) were in use at the time of the analysis, or the time horizon over which impacts were planned was recent; and (3) had dedicated section(s) and/or substantive goals and activities about (rather than just passing references to) research impact strategy. Four strategies were excluded from the analysis on the basis of these criteria. Where a dedicated impact strategy was available, this was analysed instead of the wider research or institutional strategy (for example, the University of the West of England, Bristol), unless impact strategies for that institution were designed for a single unit within it (for example, the University of Glasgow’s Knowledge Exchange and Innovation Strategy 2017–21 was analysed in addition to the Public Engagement and Knowledge Exchange Strategy of the University of Glasgow’s Institute of Health and Wellbeing). One of the documents analysed was a report making recommendations for university strategy to enhance impact (the University of British Columbia), and two were draft strategies (the University of Auckland and Plymouth Marine Laboratory).
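The three inclusion criteria above amount to a simple screening rule. The sketch below is illustrative only; the record fields are hypothetical and do not reflect the authors’ actual screening process:

```python
from dataclasses import dataclass

@dataclass
class CandidateDocument:
    # Hypothetical fields, one per inclusion criterion described in the text.
    conducts_own_research: bool        # (1) unit/institution conducts its own research
    current_or_recent: bool            # (2) in use, or covering a recent planning horizon
    substantive_impact_content: bool   # (3) dedicated section(s) or substantive impact goals

def include(doc: CandidateDocument) -> bool:
    """A document enters the sample only if it meets all three criteria."""
    return doc.conducts_own_research and doc.current_or_recent and doc.substantive_impact_content

# Example: a funder's strategy fails criterion (1) and is excluded.
print(include(CandidateDocument(False, True, True)))  # False
```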

Only text pertaining to research impact was extracted and analysed from the strategies. This excluded, for example, information about sustainability initiatives (for example, in estates) not linked to research, and impact from teaching where it was not explicitly linked to research (for example, widening participation initiatives). All searches and requests were conducted in English, which biased the sample towards institutions in English-speaking countries and motivated the specific inclusion of strategies from two non-English-speaking jurisdictions: Denmark and Hong Kong, China.

            The thematic analysis approach outlined by Braun and Clarke (2006) was used to undertake in-depth analysis of strategies, using Microsoft Excel to manage the data. All qualitative analysis was conducted by the lead author to maintain consistency across the sample. Themes were developed a posteriori during an initial open coding phase, as described in Braun et al. (2015), with themes being subdivided and integrated when sub-themes or repeating themes emerged during the analysis. Axial coding (grouping and abstracting data into categories) was then used to organise themes into theoretical constructs that informed the development of the typology in the next section. Points have been illustrated with quotations to provide examples of the themes that emerged. As the sample consists mainly of publicly available texts, these are not anonymised, unless the strategy was provided for analysis on the condition of confidentiality.
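To illustrate the shape of the coding workflow described above (open codes consolidated into themes via axial coding), a minimal data structure might look as follows; the excerpts and the code-to-theme mapping are invented for demonstration, although the theme labels are those reported in the Results:

```python
from collections import defaultdict

# Hypothetical open-coded excerpts from strategy documents (illustrative only).
coded_excerpts = [
    {"strategy": "Example University A", "open_code": "industry partnerships"},
    {"strategy": "Example University B", "open_code": "impact champions"},
    {"strategy": "Example Institute C", "open_code": "co-produced research"},
]

# Axial coding: group open codes under higher-level themes (mapping is illustrative;
# theme labels are taken from the six themes reported in the Results section).
axial_map = {
    "industry partnerships": "Engagement and partnerships",
    "co-produced research": "Co-production and boundary organisations",
    "impact champions": "Resourcing for impact",
}

themes = defaultdict(list)
for excerpt in coded_excerpts:
    themes[axial_map[excerpt["open_code"]]].append(excerpt)

for theme, excerpts in themes.items():
    print(theme, len(excerpts))
```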

            Methods for an additional quantitative analysis based on word frequencies are available in a pre-print version of this paper, alongside a database of all publicly available strategies analysed in this paper, at www.fasttrackimpact.com/impactstrategies.
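The word-frequency methods themselves are documented in the pre-print; purely as a sketch of how such frequencies could be computed, assuming one plain-text file per strategy grouped into folders by jurisdiction (a layout assumed here, not described by the authors):

```python
import re
from collections import Counter
from pathlib import Path

def word_frequencies(text: str) -> Counter:
    """Count lower-cased word tokens in a strategy document."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

# Hypothetical corpus layout: strategies/<jurisdiction>/<institution>.txt
by_jurisdiction = {}
for folder in sorted(Path("strategies").iterdir()):
    if not folder.is_dir():
        continue
    counts = Counter()
    for path in folder.glob("*.txt"):
        counts += word_frequencies(path.read_text(encoding="utf-8"))
    by_jurisdiction[folder.name] = counts

# For example, compare how often 'indigenous' appears across jurisdictions.
for jurisdiction, counts in by_jurisdiction.items():
    print(jurisdiction, counts["indigenous"])
```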

            Results

Results for the word frequency analysis can be found in the pre-print version of the paper at www.fasttrackimpact.com/impactstrategies. These findings show substantive differences in the language used across jurisdictions; for example, the word ‘Indigenous’ appeared more frequently in Australian and Canadian strategies. There were also differences in language between dedicated impact strategies and those that integrated impact into wider institutional strategies and plans, and between university strategies and those of more specialised sub-units within universities, cross-institutional research programmes and independent research institutions. To explore these comparisons further and draw more in-depth lessons, the qualitative thematic analysis identified six themes that recurred across the sample: engagement and partnerships, co-production and boundary organisations, resourcing for impact, impact training, monitoring and evaluation, and impact culture. Each is explored in the rest of this section.

            Engagement and partnerships

Building and maintaining relationships is well established as one of the most valuable precursors to achieving research impact (Stanley, 2016), so it is not surprising that the most prevalent theme across the strategies reviewed was partnerships and stakeholder and public engagement (a key theme in 51 out of 77 strategies). However, approaches to engagement and partnerships varied widely, often shaped by considerations of place or the types of impact sought; while most strategies described planned activities, some addressed the importance of needs analysis or of planning to improve the nature of relationships on an ongoing basis.

            University strategies identified partnerships with organisations across the local region, and at national and international scales. Engagement within the university’s city and region was a common theme, and this was particularly prevalent in London-based universities. For example, the strategy for King’s College London aspired by 2029 to ‘be regarded throughout the world as London’s leading civic university … making a valuable contribution to the capital’s health and success through a wide range of collaborations that both draw London into King’s and put King’s expertise to work in productive ways that have meaning for London’.

Partnerships with business and industry featured in strategies across the sample. Research and innovation clusters (also named ‘precincts’) were mentioned in strategies for Monash University and the University of Melbourne. These are geographical areas, typically in the same city or state as the university, where universities and companies cluster (often including co-location in dedicated buildings), pool facilities and expertise, and connect with start-ups and business incubators, facilitating economic and social development. The University of Oxford was expanding innovation districts in and around Oxford, a number of universities collaborated in science parks (for example, the Universities of Lincoln, Exeter and Durham were investing in Lincoln Science Park, Exeter Science Park and NETPark respectively, and City University of Hong Kong was running an incubation programme at Shatin Science Park), and Anglia Ruskin University was one of many universities that ran shared spaces for the co-location of start-ups and applied research groups. In the UK, similar proposals were made for engaging with Local Enterprise Partnerships to establish collaborative spaces where researchers and business could drive the local economy, or for the creation of ‘catapults’ and incubators to drive innovation in collaboration with local industry. The University of Melbourne strategy explained how this works:

            To foster productive research collaborations, a key starting point is the development of strong clusters of research activity which bring together people and infrastructure in productive ecosystems. … With appropriate settings, these clusters can become ‘research precincts’ – a powerful means of harnessing collaborations and boosting innovation effectiveness. … Precincts offer a way to reach across and beyond organisational boundaries to generate far greater impact on challenging problems than the University could achieve alone. They can have a physical centre and be linked to nodes in different geographical locations.

            In some cases, generation of new income streams motivated the development of partnerships. For example, La Trobe University explained how ‘existing and new partners will actively seek out our researchers because of their proven track-record … [and this] strategic engagement will increase our end-user (Category 2-4) income stream’. Imperial College London’s strategy included a goal to ‘diversify funding by engaging with new public and third sector collaborators’. Danish strategies were less specific about business engagement; for example, Syddansk University simply sought to ‘promote research areas that match positions of strength in the regional business community’, and Aalborg University’s ‘carefully selected knowledge-sharing partnerships are based on mutuality and a shared focus’.

            University strategies were also more likely to include civic and/or public engagement. The majority of public engagement in the strategies aimed to provide benefits to society, but often articulated one-way knowledge transfer and communication methods. For example, Goldsmiths, University of London offered a ‘range of short courses, our public lectures and events programme and our library and archives’ to make knowledge widely accessible to their publics. Similarly, Sheffield Hallam and Ulster University proposed ‘marketing’ their research to the public via social media, driving engagement with events and archives. The University of Copenhagen emphasised the importance of schools as key stakeholders, and sought to develop teaching materials and contribute towards curriculum development. Similarly, Lingnan University sought to support STEAM (science, technology, engineering, arts and mathematics) education in primary and secondary schools.

Some strategies framed public engagement as capacity building, including more two-way knowledge exchange mechanisms. For example, Memorial University of Newfoundland aimed to ‘[build] greater capacity for our external partners and collaborators through public engagement activities’. The University of Calgary described how community engagement shapes its research, given its ‘responsibility to engage our communities that we serve and lead in discussions about important issues where evidence is required to better understand those issues or even resolve them. Our community engagement significantly influences our research directions.’

            Strategies also aimed for international partnerships, and these were wide-ranging, including with: business and industry; governments and agencies; UN organisations and other international institutions and convention bodies; cultural organisations; community organisations and other civil society organisations; practitioners; experts and researchers in think tanks and academia; and not-for-profit organisations and philanthropic groups. It was common for strategies to identify partnerships with other universities internationally, but few linked this to impact. The Stockholm Environment Institute did, however, and they explained why: ‘We partner with other knowledge-providers for multiple reasons: to access expertise, to ensure our research is firmly grounded by consulting with local and regional research partners, and to create alliances for achieving greater impact on policy and practice.’

            More rarely, strategies referenced partnerships with funders (for example, Research England, which coordinates the UK’s Research Excellence Framework). The Stockholm Environment Institute explained their reason for including their funders as partners, to help them to identify ‘knowledge frontiers, [provide] scientific approaches to their problems, and [deliver] outputs and results that are accessible and actionable’, but emphasised the need to ‘operate at arm’s length’ to avoid conflicts of interest.

            There was not always a specific strategy to develop partnerships with particular sectors or types of organisation; for example, the University of Wollongong proposed a needs analysis to prioritise future partnerships in relation to the university’s strengths and stakeholder needs. Similarly, the University of Exeter proposed systematically mapping industry needs on a sector-by-sector basis ‘against current expertise and academic hires’ to provide ‘introductions, facilitate initial meetings … and build industrial interactions and engagement’. The University of Auckland proposed hosting industry/stakeholder days to identify stakeholder needs, which they argued would ‘provide opportunities for our local communities and other stakeholders to share with us what impact means to them’. Although partnerships were sometimes focused on seeking funding from external organisations to create new infrastructure or capacity within the university, they were more often focused on delivering benefits to collaborating groups and those they served in society, in some cases referencing the achievement of United Nations Sustainable Development Goals.

Strategies referenced a number of mechanisms for maintaining and deepening partnerships. For example, the University of Ottawa emphasised networking events, with a knowledge mobilisation (KMb) hub to facilitate networking, identify events and consultations run by their stakeholders in which researchers could participate, and provide tools and services to support this. The University of Queensland had a Special Studies Program to encourage researchers to engage with industry, government and the wider community, and funded industry placements for research students. The University of Lincoln funded ‘staff placements in, and exchanges with, industry to ensure that our … research is relevant and up-to-date’. The University of Glasgow had ‘outcome agreements’ with the Scottish Government ‘to support their ambitions in maximising the opportunities afforded to the economy through the exploitation of research undertaken by our world-renowned academics and scientists’. King’s College London proposed ‘a civic engagement programme that will deliver a coordinated approach to student volunteering, credit-bearing modules in partnership with local community organisations, and an annual Civic Challenge’. The University of Portsmouth strategy talked about its ‘portfolio of CPD [continuing professional development] programmes for government, industry and third-sector partners in response to emerging workforce needs’. The WISERD project used an annual conference and ‘evidence symposia across key themes of policy and academic interest’. The University of Calgary suggested ‘pitch competitions on thematic problems [to] exchange information with external partners’. Hong Kong University of Science and Technology aimed to ‘promote community service: by raising civic awareness and social entrepreneurship and setting up an ecosystem to support social enterprise start-ups’ and by reaching out ‘to more students and faculty and [encouraging] them to actively participate in community service programs’. Similarly, Roskilde University and the Technical University of Denmark taught their students to become ‘change agents’, to develop businesses and to ‘take leadership of sustainable change’. Aalborg University used ‘problem-based learning as a valuable means to establish successful collaboration relationships between our surrounding community and our students. This approach enables the students to gain practical knowledge while the business community gains an insight in the most recent theoretical knowledge.’

Finally, there was a strong emphasis on partnerships with Indigenous populations and their representative bodies across the Australian, New Zealand and Canadian strategies, with commitments often positioned prominently. In many cases, these were framed as acknowledgements, ‘paying respects’ to Indigenous elders, knowledge and land, and they were not all linked to research impact. Where impacts were identified, they tended to focus on ‘Indigenous advancement’, ‘providing opportunities’, ‘reconciliation’ and ‘empowering’ Indigenous staff and students. Some, however, made explicit links to research and impact; for example, the University of Wollongong included: ‘embedding Indigenous principles and practice … into our research processes. A holistic and inclusive institutional view of impact along with academic leadership will also safeguard against instrumentalising university research (directing research solely towards utilisation or political priorities).’ Massey University provided ‘support for and facilitation of mentoring networks designed to support researchers working in Pasifika research and development, especially those working in partnership with external stakeholders and Pasifika communities’. Queen’s University, Canada, aimed to ‘support the diversity of perspectives across First Nations, Métis and Inuit communities, while working with the Indigenous research community to examine how Indigenous ways of knowing impact research across the university’. They did this through a Principal’s Implementation Committee on Racism, Diversity and Inclusion, through community-based participatory research partnerships with Indigenous groups to examine issues of mutual interest, and by recruiting more Indigenous scholars through initiatives such as the Queen’s National Scholar Program and the Canada Research Chairs Program. Similarly, the University of Regina aimed to:

            build and strengthen our relationships with urban, rural, and remote Indigenous communities with an aim of accountable and reciprocal research; enhance Indigenous engagement in the research enterprise … ; [and] enhance professional development opportunities and supports for units and faculties to learn to Indigenize and decolonize pedagogy, curricula, policies, procedures, and processes.

            Co-production and boundary organisations

Although co-production and boundary organisations are closely related to engagement and partnerships, some strategies took a distinct approach that moved beyond engagement. Many of the strategies discussed more interactive co-construction of knowledge with partners and stakeholders (Campbell and Vanderhoven, 2016), where researchers work in partnership with knowledge users. Co-production has been embraced ‘because of its potential to improve the quality and relevance of research and its effect on policy and practice’ (Redman et al., 2021: 1), and this was a stronger theme across more specialist units. Given their focus, these units tended to provide more detail about specific groups and organisations linked to their strategic impact goals and scale of operation. For example, the Institute of Development Studies focused on learning partnerships that enable them to learn from their stakeholders to better understand the contexts in which they produce knowledge, so they could co-produce more relevant findings:

            Learning partnerships enable us to better understand the environment in which development happens and map out desired changes, key stakeholders and policy processes. Achieving impact means not just producing evidence, but engaging with the politics of knowledge – who it is produced by and for, and whose voice counts …

            The Stockholm Environment Institute went on to describe some of the co-production methods they used: ‘We often build engagement into research, through methods such as citizen science or participatory scenario development, co-production processes and workshops and dialogues of different types, as well as through tools and platforms that users can work with independently.’ The Plymouth Marine Laboratory strategy provided support to researchers to identify stakeholders early in the research process so that they can co-develop proposals together. CGIAR took this a step further to propose work to get ‘greater strategic clarity on where CGIAR lies in the development landscape’, in order to match stakeholder partnerships to challenges, building in ‘greater diversity in the range of research and scaling partners’. They went on to specify three types of partnership they wished to target: (1) ‘partnerships along the impact pathway … to co-deliver on innovations in technology, institutions and policy’; (2) ‘partnerships with the private sector’; and (3) ‘multi-stakeholder platforms’, which they describe as ‘structured alliances of stakeholders from public, private and civil society convened in the international development community to address complex global problems enshrined in the SDGs [Sustainable Development Goals], with CGIAR participating in those whose architecture and activities are best designed to link global policy and local action, and whose actions are informed by research’.

            Some of the university strategies also emphasised co-production of knowledge, but they rarely considered how this would be done. One exception was the University of Glasgow, which, in addition to providing training for community and public engagement, sought to ‘identify innovative and effective models of community engagement and co-creation from within the institution and provide mechanisms through which these models can be shared, adapted and adopted’.

            Universities did, however, regularly work with boundary organisations, brokers, intermediaries and boundary spanners (for definitions, see Neal et al., 2021). The most common approaches were focused on interfaces with industry via innovation precincts/districts, science parks and co-location facilities (see above). In the creative sector, Norwich University of the Arts positioned its East Gallery NUA as a boundary organisation designed ‘to develop a formal network of partner galleries across the UK and Europe with the potential to host collaborative exhibitions, it becomes a key resource for NUA academic staff and PGRs [postgraduate researchers] as well as external colleagues as a forum through which research activity can be organised and disseminated’. The University of British Columbia’s KMb Unit employed brokers to operate ‘as a gateway of access between UBC and other communities/general public … [It] brokers partnerships between communities at UBC and communities outside UBC and operates as a connector for currently existing knowledge mobilization groups and personnel, enhancing cross-pollination, synergy and innovation.’ Some strategies mentioned working with organisations and networks that could connect them with other universities working on impact or public engagement, as well as with stakeholders, for example, Research Impact Canada and the UK’s National Co-ordinating Centre for Public Engagement.

In contrast, specialist units were more likely to position themselves as boundary organisations. For example, the ClimateXChange Centre of Expertise was designed to act ‘both as a knowledge broker between researchers and policy, and as a research provider’. It operated in two modes. First, it ran a research programme co-produced with the Scottish Government and its agencies ‘to respond to questions and requests for evidence, identify upcoming evidence needs, and then independently plan our research and analysis to meet policy timelines’. Second, it provided a knowledge brokerage service to ‘facilitate conversations and broker knowledge across sectors, disciplines and institutions to provide new insights for policy’. Similarly, CGIAR engaged in ‘innovation systems’, which they described in an early draft of their strategy as follows: ‘The concept of an innovation system stresses that the exchange of technology, information and other inputs among people, enterprises, and institutions is key to an innovative process. Within innovation systems, there are no hard boundaries where CGIAR stops and starts.’

            Resourcing for impact

            Alongside the externally focused engagement and co-production strategies was a set of more inward-looking activities designed to support researchers from inside the organisations, and to improve capacity and capability for delivering impact. The majority of university strategies included resources for dedicated professional services impact staff to provide this support. These teams were typically separate from commercialisation teams or other knowledge exchange departments which may have a more external focus (Ward, 2017), although both clearly worked closely together. Although implicit in most strategies, it was clear that commercialisation teams and organisations such as Warwick Ventures and Oxford University Innovation focused on industry engagement and economic impacts, while broader impact teams focused on other sectors and types of impact. These teams provided a range of functions, including:

            • helping researchers develop pathways to impact (including for funding applications)

            • running events, including stakeholder workshops and networking events

            • administering impact funding (see below)

            • administering systems and providing training in impact monitoring and evaluation

            • regularly reviewing impact capabilities in relation to their strategy

            • monitoring evolving external agendas to recommend appropriate responses

            • sharing good practice in impact generation

            • coordinating impact generation-related activities across the institution

            • coordinating and delivering training

            • providing or facilitating specialist support for engaging with industry and policy

            • identifying and managing impact case studies

            • providing impact stories to communications teams and press offices

            • creating and managing online resources for researchers to generate impact, in particular, impact toolkits, guides, training resources and case studies.

            Some activities which crossed over more directly with the commercialisation or knowledge exchange teams included:

• managing relationships with stakeholders and alumni (in some cases using Customer Relationship Management systems)

            • providing a gateway and/or clearly identified contact points for external organisations.

            Another common approach was to appoint ‘impact champions’ (these featured in strategies from the University of Wollongong, La Trobe University, the University of Auckland, the University of the West of England, London Metropolitan University, Ulster University, the Chinese University of Hong Kong and the Glasgow Institute of Health and Wellbeing). The champion role included: helping build skills and approaches to impact relevant to the discipline or unit; sharing the knowledge of how to translate and communicate research impact within the discipline or unit; helping prioritise effort and resources for impact; recognising and celebrating impacts occurring in their area; sharing good practice with other champions across the university; and, in the case of London Metropolitan University, being responsible for actually generating impact for research centres. Champions were responsible for these tasks at the scale of departments, schools or faculties, with the goal of providing insights into the unique challenges and opportunities faced by colleagues in their disciplinary area. With the exception of the University of Auckland, which specified both academic and professional services champions, the role was either academic or not specified in the strategy.

            A number of universities ran internal impact funding schemes to generate new impacts or to extend existing impacts, and in the UK and Australia, in particular, also to facilitate evidence collection activities. The scope of the Warwick Impact Fund (University of Warwick) was particularly broad:

            Internal investment will be available to support a range of innovation activities, from industry–University secondments, industrial fellowships in areas of relevance to the Government’s Industrial Strategy, collaborative research projects with practitioners in the public and private sector and with creative industries, ‘industry engagement days’, and ‘proof of concept’ and commercialisation funding.

Funds in other universities were more specialised; for example, the University of Edinburgh’s College of Arts, Humanities and Social Sciences ran ‘a venture fund for investment in commercial exploitation of research (through external partnership with Arthurian Life Sciences), [with] strong links to the SET Squared innovation and business incubator, of which we are a member’. In the UK, the availability of such funds may depend on an institution’s eligibility for government funding from UK Research and Innovation via the Higher Education Innovation Fund, or for Impact Acceleration Accounts from the UK’s disciplinary research councils.

            Many universities had internal structures to provide accountability and to link the activities of impact teams to wider activity in the institution; for example, the University of Wollongong’s strategy recommended the creation of an ‘impact and engagement steering committee reporting to the University Research Committee … compris[ing] senior research staff, relevant support staff and impact champions’. The University of the West of England had a similar structure designed to provide ‘clear reporting lines for research impact within faculties and between faculties and the university’. This was often operationalised through networks of academic and professional services staff with specific responsibilities for impact, including departmental or school impact officers and facilitators from professional services, and academic coordinators and directors of impact or impact champions.

            In contrast to the emphasis of university strategies on impact teams, roles, organisational structures and internal funding schemes, impact strategies from more specialised units were more likely to provide training and support and facilitate best practice at the scale of individual research projects or teams, or to integrate across projects thematically to achieve economies of scale in specific impact domains. For example, CGIAR’s strategy included a mechanism to ‘build a shared portfolio of research for development based on pooled funding … [to] achieve the levels of partnership required to scale impact, and attract the best minds to the challenges facing our food, land and water systems’. There was limited information on resourcing and structures for impact in Danish and Hong Kong strategies.

            Impact training

            Training was the main alternative (or supplement) to impact support in the strategies reviewed, aiming to build ‘impact literacy’ (as the University of Auckland called it), presumably referring to Bayley and Phipps’s (2019) framework for understanding: what impacts happen, for whom, and how researchers can demonstrate it; how research can be mobilised into action; and who is needed, with what skills, to make that happen. In addition to the necessary knowledge and skills to underpin an impact culture, some strategies mentioned the need to change perceptions of impact, for example, ‘socialis[ing] the importance of KMb for research impact among graduate students and faculty’ (the University of British Columbia), ‘promot[ing] the benefits of designing research to enhance impact’ (the University of Auckland) and providing opportunities to connect with researchers already engaged with impact via ‘networks to connect those who are interested in impact or who are engaging in impact activities already’ (the University of Auckland). The University of the West of England and the University of Calgary proposed going beyond a network to establish an impact ‘community of practice’, presumably referring to Wenger-Trayner and Wenger-Trayner’s (2015: 2) concept of a group of people who ‘share a concern or a passion for something they do and learn how to do it better as they interact regularly’.

Training is a standard way to build capacity for a comparatively new research skill such as generating impact, so it is perhaps surprising that impact training was explicitly included in only a minority of strategies (17 out of the 77 reviewed). In some cases, these were passing mentions, or were restricted to certain skills or groups. For example, the University of Queensland highlighted communications training for graduate researchers, and the University of Wollongong, La Trobe University, Manchester Metropolitan University, Memorial University of Newfoundland, Plymouth Marine Laboratory and the WISERD project included mentoring or coaching programmes. Although few strategies provided details, topics and skills covered in training programmes included:

            • foundations and principles

            • planning and designing research for impact

            • building and maintaining stakeholder relationships

            • tracking impact pathways

            • measuring impact success and evidencing impact

            • communication and dissemination, including media and online engagement skills

            • public engagement

            • public policy impact

            • ethics of research and impact

            • entrepreneurial skills for researchers and commercialisation routes

            • co-production skills.

            The University of Wollongong’s training on co-production sought to ‘understand and recognise relevant forms of evidence (including those outside traditional forms valued solely by academics)’. They also emphasised the need to take a coordinated approach to training, revisiting a ‘researcher development needs analysis’, and identifying training priorities at school and faculty levels. A number of strategies included the development of online toolkits, guides, training resources and case studies, including definitions, templates and signposting to specialist sources of support. As might be expected, there were often links between the content of these published resources, training and the functions for which impact teams were responsible (see previous section); for example, all three covered monitoring and evaluation of impact.

            The independent research institutes tended to take a different approach to impact training compared to the other organisations and units in the sample. While recognising the importance of building capacity in their research community, their focus was on training and building capacity among stakeholder communities, for example, training them in the use of decision-support or analytic tools arising from research or in the use of collaborative data platforms (Stockholm Environment Institute), and ‘targeted capacity development … ranging from training-of-trainers at the farmer level through to ongoing institutional support to national partners … and technical advice to policy-makers at global level’ (CGIAR). The only parallel to this in the university strategies was the Memorial University of Newfoundland, which sought to build stakeholder capacity for engaging with researchers, and the University of British Columbia, which ‘trains and builds capacity for research utilization in partnerships with different stakeholders’.

            Monitoring and evaluation

            Few of the strategies included key performance indicators or success measures (16 out of 77), or any more detailed information about monitoring and evaluation of impact (15 out of 77). Of all the jurisdictions analysed, Australian strategies were most likely to mention the word ‘evaluation’, but only 3 strategies from this country contained substantive monitoring and evaluation activities (described below). However, 5 out of the 12 specialised units, programmes and institutions had a strong focus on monitoring and evaluation, and these were most likely to use impact planning tools, such as theory of change and logic models.

Some strategies addressed implementation via a theory of change (5 out of 77) or a logic model or similar (4 out of 77), visualising or tabulating actions and assigning responsibilities and deadlines. For example, Massey University visualised their implementation plan as a matrix (Figure 1), the University of South Wales used a simple logic model, and N8 AgriFood drew on logical framework analysis to plan and track progress towards their impact goals (Figure 2). The N8 AgriFood logic model included assumptions, through which risks could be identified. Generally, however, there was limited consideration of risks across the strategies reviewed. Notable exceptions were Norwich University of the Arts and the Third Sector Research Centre, which both considered a range of risks in their strategies. Both focused primarily on non-delivery risks, while the University of Auckland and the University of British Columbia reported on staff surveys which also highlighted risks of negative unintended consequences arising from impact generation activities (incidentally, the University of British Columbia’s was the only strategy to mention the need for more research on impact). Plymouth Marine Laboratory included an implementation timeline with milestones linked to a theory of change (Figure 3). In other strategies, reference was made to committees responsible for overseeing progress towards strategy goals (for example, see the committee structure proposed by the University of Wollongong in the ‘Resourcing for impact’ section above).

            Figure 1.

            Implementation plan for Massey University’s strategic plan, including enterprise and knowledge transfer (KT) in the far-right column (Source: Massey University, Research Strategy 2018–2022)

            Figure 2.

            Examples of logic models from the University of South Wales (top) and N8 AgriFood (bottom) (Sources: University of South Wales, Research Strategy 2018–2028; N8 AgriFood Theory of Change and Logic Model)

            Figure 3.

Theory of change (top) and implementation plan (bottom) from Plymouth Marine Laboratory (Source: Plymouth Marine Laboratory Research Impact Plan 2020–2025)
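To make concrete the kind of information that a simple logic model or theory of change captures (as used by the University of South Wales, N8 AgriFood and Plymouth Marine Laboratory above), the sketch below uses generic stages and placeholder entries; it is not reproduced from any of the strategies analysed:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    """Generic logic model stages; entries below are placeholders, not from any analysed strategy."""
    inputs: List[str] = field(default_factory=list)
    activities: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)
    impacts: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)  # where risks to delivery can be flagged

example = LogicModel(
    inputs=["dedicated impact team", "seed funding"],
    activities=["stakeholder workshops", "policy briefings"],
    outputs=["co-produced reports"],
    outcomes=["evidence cited in policy consultations"],
    impacts=["change in practice among target beneficiaries"],
    assumptions=["partners remain engaged throughout the programme"],
)
print(example.assumptions)
```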

            Table 2 shows the most common indicators used to track progress towards strategic impact goals. The most frequently cited indicator was income from consultancy and industry (used in 11 strategies). These mainly came from the UK and Australia, which might reflect the inclusion of industry funding as an indicator in Australia’s Engagement and Impact Assessment and in the UK’s Knowledge Exchange Framework. There were some nuances in the indicators included in this category; for example, the University of Liverpool aimed to achieve specific numbers of new funded strategic partnerships with industry, and the University of Queensland aimed to ‘rank first in Australia for attracting research income from industry’. Whereas this indicator focused specifically on funding from industry, 3 strategies also included indicators based on funding for impact from any source.

            Table 2.

            Key performance indicators and success measures used to track progress towards strategic impact goals, ranked by the frequency with which they appeared in strategies (Source: Authors, 2022)

Impact indicator | Frequency
Income from consultancy and industry | 11
Numbers and/or proportion of high-scoring impact case studies (including changes in impact rankings based on case studies) | 6
Attendance figures for training events and evaluations of training and impact resources | 5
Number of funded impact projects (including impact funding from government; for example, Higher Education Innovation Fund in the UK) | 4
Number of press releases and/or media stories featuring research impacts | 4
Impact key performance indicators in appraisals met and promotions due to impact | 2
Applications for internal impact awards | 2
Impact monitoring established with database | 2
Changes in government policy (or citations in policy documents) resulting from research | 2
Number of stakeholder engagement activities delivered | 2
Number of requests for impact support received | 2
Positive researcher attitudes towards impact and support services | 2
Changes in audience awareness and/or attitudes resulting from research | 2
Proportion of publications co-authored with non-academic partners | 1
Number of staff engaged with staff placements in, and exchanges with, industry | 1
Presence and representation on relevant boards and bodies | 1
Number of spin-out companies | 1
Number of impact opportunities identified, planned and realised | 1
Customer Relationship Management system established | 1
Scholarly publications arising from institutional support for impact | 1

            Although only 2 impact strategies included monitoring and evaluation in their success measures, 15 strategies included material on this topic elsewhere in the document. In many cases, this pertained to the investigation of options for monitoring systems (for example, La Trobe University and the University of Auckland) or aspirations to ‘[develop] processes to capture, collate and celebrate our impact’ (Sheffield Hallam University). Investment in impact monitoring and evaluation was often linked to research assessments and the generation of case studies in the UK and Australia, and in other cases, it was sometimes linked to the generation of annual reports on impact. Some mentioned specific tools, such as Vertigo Ventures Impact Tracker (University of Wollongong), and aimed to increase the use of these tools by researchers. However, it was recognised that impact evaluation was likely to require additional input. For example, the University of Wollongong suggested that despite their investment in the impact tracking software:

            without appropriate support mechanisms in place, researchers will be unable to adequately capture or quantify impact. Some of this could be achieved through evidence gathering support from research assistants or professional staff, but in some cases specific expertise (e.g. interviews or surveys) or analytics support (e.g. website demographic analysis) may be required.

As a result, some strategies also aimed to increase capacity for impact evaluation among researchers, for example, by developing and promoting the use of specific evaluation tools (University of Ottawa). Independent research institutes were the only organisations that employed independent external impact evaluators, partly in response to requirements from their funders. The Stockholm Environment Institute (SEI) had a strong culture of learning around monitoring and evaluation:

            SEI has a scheme of learning activities to ensure that the monitoring and reporting on outputs and outcomes from research activities are fed back into the organization … We regularly assess our effectiveness in achieving outcomes, capturing key success factors and the dos and don’ts of, for example, stakeholder engagement.

            Plymouth Marine Laboratory’s approach to monitoring and evaluation was also particularly comprehensive, including: the identification of information gathering points at key project stages, reporting structures, identification of impact indicators at project proposal stage, monthly science area reports providing updates on impact and potential case studies, annual impact reporting for all projects, systems to review all new publications for their impact potential, training for staff on monitoring and evaluation, maintenance of an impact database, timely requests for testimonials from research users, proactive case study monitoring and support, and regular monitoring against the organisation’s theory of change (Figure 3) and impact performance indicators. There was limited attention given to monitoring and evaluation in New Zealand, Hong Kong and Danish strategies, compared to the other jurisdictions in the sample.

            Impact culture

Pressures on research productivity arising from assessment, precarious contracts, increased competition for research funding and a focus on outputs at the cost of all else have led to a recent rise in interest in ‘research culture’ or, as the Royal Society (n.d.: n.p.) defines it, ‘the behaviours, values, expectations, attitudes and norms of our research communities’. Work by the Wellcome Trust (Moran et al., 2020) and the Association of Research Managers and Administrators (ARMA, 2020) in the UK considers factors in research activity that can give rise to ‘disruption’ of research, and how this is manifested in ‘poor’ research culture. Wellcome concluded that ‘Factors identified as disruptive to research culture included chasing impact, increased competition, proliferation of metrics, job insecurity and rigid career pathways’ (Moran et al., 2020: 1). The UK Government released an R&D People and Culture Strategy in summer 2021 (Department for Business, Energy and Industrial Strategy, 2021), which aims to ensure that ‘everyone’s contribution is valued, and the UK has an outstanding research culture that truly supports discovery, diversity and innovation, and offers varied and diverse careers that bring excitement and recognition’ (Kwarteng, 2021: n.p.), although most actions are currently reviews and consultations.

Many of the institutional strategies we examined explicitly considered research culture, and much of this content can reasonably be taken to apply to impact culture as a subcategory within it. For example, much of the content in strategies about research ethics, responsible research and innovation, open access policies, equality, diversity and inclusion, and staff health and well-being would apply to impact. For the purposes of this study, however, only material pertaining specifically to impact culture was analysed.

            First, it should be noted that many references to culture were non-specific and ill-defined, and in reality were probably referring to an ‘approach’ rather than to a culture, if culture is understood in relation to how people ‘find meaning as individuals (on the basis of their own perceptions), collectively (on the basis of social norms and shared perceptions) and through their relationship with objects’, or if impact culture specifically is understood as ‘communities of people with complementary purpose who have the capacity to use their research to benefit society’ (Reed and Fazey, 2021: 2, 4). For example: La Trobe University aimed to ‘develop a culture that values and generates impact from research through industry engagement’; Queen’s University, Canada, stated that their ‘culture of sustainability will be underpinned by our commitment to the United Nations Sustainable Development Goals’; the University of Lincoln aimed ‘to promote a culture of enterprise and innovation across our communities’; Anglia Ruskin University aimed to ‘deliver a culture and working environment in which collaborative and multi-disciplinary research and innovation thrive’; and London Metropolitan University aimed to ‘embed impact and knowledge exchange within our research culture’. Very rarely did these and other strategies like them define what they mean by culture or explain how they would achieve their cultural goals.

            Impact culture was often associated with values, for example, the Hong Kong Polytechnic University aimed to ‘align departmental and individual performance measures to better reflect a congruent value system that properly addresses the balance among education, research and community impact through KT [knowledge transfer] and entrepreneurship’. The Brunel University strategy described a ‘culture of mutual respect and equality of opportunity, placing the health and welfare of individuals at the heart of our ethos’. The University of Edinburgh described ‘a shared culture that values people and provides leadership within a supportive working environment built upon collaboration, communication and coordination’. Queen Mary University of London sought to ‘build on our core values to further embed a culture of engaged research practice’. The Stockholm Environment Institute provided one of the richest descriptions of an impact culture rooted in clearly articulated values and principles:

            Our organizational culture lays the foundation for the way we work with partners and with each other. Our culture is grounded in our development ethos and commitment to resolving sustainability challenges, from local to global. It stands for transparent and inclusive decision making, for building and maintaining trust, for empowering our partners, for giving space to diverse voices, and for delivery of the highest quality. It holds us to ethical standards of integrity, collegiality and respect in all our professional interactions.

            Hong Kong University’s strategy sought transformational change in its culture to deliver impact: ‘We will make a paradigm shift to focus on and reward … research innovations that benefit communities and transform global technologies … There will be a shift from activity to value, from output to outcome, and from strength to leadership.’ To do this, they proposed to:

            provide more opportunities for outcomes-driven translational and transformational research; support research that transcends intellectual output to meeting an innovative outcome that has value and impact, and driven by societal needs or enterprise; [and] deliver demonstrable and significant outcomes to our social communities and the technological world through research, innovation and enterprise development.

            Aligned with research culture and values is the issue of intrinsic and extrinsic drivers of behaviours. Extrinsic drivers are those where external demands or incentives provide motivation for researchers and organisations, while intrinsic drivers build on personal values and a fundamental self-motivated desire to meet the needs of society (Ryan and Deci, 2000). These drivers are the building blocks of research and impact culture.

UK strategies were more likely than strategies from other countries to make explicit links between impact and formal assessments of research and impact (16 strategies, compared to 2 in Australia, 1 each in New Zealand and Canada, and none in Denmark; 2 of the Hong Kong strategies referenced the Research Assessment Exercise, with 1 referencing the UK’s system as a benchmark). Although these represent a minority of UK strategies, links to the REF were particularly prominent in some of them. For example, the REF was mentioned in the first point in the first main section of Manchester Metropolitan University’s strategy, with the document later implying that investment in research for impact was typically dependent on a return on investment via quality-related (QR) research funding allocated on the basis of REF performance: ‘Internal resources will be directed at research that meets our ethical standards and usually: a. Generates academic outputs of sufficiently high quality to attract external income … and b. Generates beneficial social, economic, environmental or cultural impact (sufficient to attract QR funding).’ Similarly, two out of six criteria for the establishment of new Research and Knowledge Exchange Centres were linked to REF performance, key performance indicators were due to be linked to the REF (similar to a number of other institutions; see Table 2), and workload allocations across the institution were linked to the REF: ‘Faculties will set a specific target for the proportion of workload allocation directed at 3* and 4* work and measure against it as a lead indicator of progress with the research strategy.’

            A total of 17 out of the 77 strategies reviewed included various kinds of extrinsic incentives designed to increase researcher engagement with impact (Table 3). These were largely absent from the five independent research institute strategies, and they were not present in Danish strategies. There were examples of impact being included in promotions exercises in the UK, Canada, Australia and New Zealand (note that this includes the University of Auckland, where inclusion in promotions was under consideration only). The need to ‘recognise’ impact generation activities was noted in two of the Hong Kong strategies, but no mechanisms were proposed (for example, Hong Kong Polytechnic University aimed to ‘enhance the benefit-sharing policy and recognition mechanism to encourage academic staff to identify and pursue high-impact KT endeavours’). Impact was only included in annual appraisals in 4 strategies (2 each from the UK and Australia), and it was only included in recruitment criteria in 2 strategies (both from Australia). Although there were general comments about the need to recognise and celebrate impact across all six jurisdictions and the independent research institutes, formal impact award ceremonies were only mentioned in 1 Australian and 1 Canadian strategy.

            Table 3.

            Extrinsic incentives designed to increase researcher engagement with impact, ranked by the frequency with which they appeared in strategies (Source: Authors, 2022)

Incentive | Frequency
Inclusion of impact in promotion criteria | 9
Less formal recognition and/or celebrations of impact (including impact showcase events) | 7
Inclusion of impact in annual appraisal criteria | 4
Inclusion of impact in recruitment criteria | 2
Research impact awards | 2
Inclusion of impact in workload allocation models | 2
Financial bonuses | 1

            In two universities that did not yet offer formal rewards for impact, strategies included findings from staff surveys that highlighted issues arising from the lack of incentives. For example, the University of Auckland strategy noted that, ‘Many of our researchers are already conducting impactful research but are often not recognised or rewarded for this, or their work is perceived negatively’, and the University of British Columbia highlighted a number of issues linked to a lack of incentives for impact, including the recognition that impact is ‘something Professors do off the side of their desk’ and ‘not yet part of the tenure and promotion discussions [or] recognized in most faculties’.

            There was limited consideration of the potential negative unintended consequences of providing extrinsic incentives for impact. One notable exception to this was La Trobe University’s strategy, which emphasised the need to reward impact without disadvantaging non-applied researchers, recognising ‘that there are researchers undertaking pure or fundamental research that may not yield “real-world” change in the immediate future but will profoundly influence the course of knowledge and the ability of other researchers to achieve future impact’.

In contrast to these extrinsic incentives to promote engagement with and reward impact, strategies also sought to engage with the intrinsic motivations of researchers. In an oblique reference to the limitations of incentivising impact via research assessments, the University of the West of England was ‘keen to ensure that a research impact culture extends beyond the REF and that as much of our research as possible is focused on being of value to society’. Connecting with the intrinsic motives of its researchers to innovate to tackle real-world challenges, the Stockholm Environment Institute described itself as having:

            a trust-based organizational culture, and our people breathe life into and carry out our mission. We put high levels of confidence in our colleagues around the world, which enables SEI researchers to innovate, take initiative, and engage with key arenas of decision making. SEI is innovative and adaptive in order to respond to new challenges.

            Discussion

The findings from our analysis showed a degree of consensus about the activities and approaches required to develop research impact, although there were notable distinctions between strategy documents from the universities and the more specialised institutions or sub-units (see the pre-print version of this paper at www.fasttrackimpact.com/impactstrategies). Specifically, two broad types of impact strategy emerged from the thematic analysis: those focused primarily on (1) enabling impact, and those focused primarily on (2) achieving impact (Table 4).

            Table 4.

Themes from the qualitative analysis and how they tended to manifest in strategies that primarily sought to enable versus achieve impact (both types had similar approaches to training) (Source: Authors, 2022)

Theme | Enabling impact strategies | Achieving impact strategies
Partnerships and engagement | Partnerships with organisations within the local region and at national and international scales, more likely to include civic and/or public engagement and mechanisms for working with Indigenous groups | Partnerships and engagement with specific groups and organisations linked to the organisation’s strategic impact goals and scale of operation
Co-production and boundary organisations | Work with boundary organisations to co-produce research for impact | Often are (or aspire to be) boundary organisations, responsible for driving co-production with specific methods or approaches adapted to their stakeholders
Resourcing for impact | More likely to have dedicated impact teams, roles, organisational structures and internal funding schemes operating at institutional scales | More likely to provide support and facilitate best practice at the scale of individual research projects or teams, or to integrate across projects thematically to achieve economies of scale in specific impact domains
Impact training | Impact training appeared in both types of strategy, with skills adapted to the disciplinary contexts of researchers (for example, policy engagement skills for researchers working in policy-relevant fields) and the operational context of the organisation (for example, international development researchers and those working in civic society were more likely to train and empower stakeholders)
Implementation, monitoring and evaluation | More likely to include key performance indicators or success measures | More likely to link monitoring and evaluation to logic models and theories of change to assess progress towards specific impact goals
Extrinsic/intrinsic impact culture | More likely to seek improvements in research assessment rankings and link promotions to impact performance, and seek to motivate researchers not naturally aligned with impact | More likely to recognise and celebrate impact less formally, drawing more on the intrinsic motivation of researchers who already align with the mission and values of the organisation

            Characteristics of enabling impact strategies:

            • They were more likely to be developed and operationalised from the top down, making greater use of incentives to facilitate researcher engagement with impact.

            • They tended to be developed by universities and research institutes to build impact capacity and culture across the institution. They were often integrated as part of a wider research or university strategy which would include values and a mission or set of goals that included impact. Very few of these strategies included an implementation plan.

            • They had a strong focus on partnerships with organisations within their local region, often with an emphasis on industry (for example, via innovation precincts, districts, science parks and co-location spaces) and community connections (for example, via civic and public engagement initiatives, and with a strong emphasis on engaging with and benefiting Indigenous groups in Australia, New Zealand and Canada). These strategies also prioritised partnerships at national and international scales, for example, with government bodies and international organisations. To do this, they sometimes collaborated with boundary organisations to engage effectively across particular sectors or populations at scale.

            • They were more likely to include investment in professional services staff dedicated to impact (whether located centrally, or locally in departments, schools and faculties), and to create academic roles to champion or direct impact within different disciplinary fields, linked to organisational structures to provide lines of accountability and reporting to central committees or leaders.

            • In addition to building skills and capacity for impact through training programmes, these strategies often sought to motivate researchers not naturally aligned with impact, for example, via opinion leaders (such as impact champions) and incentives (such as inclusion of impact in academic promotions criteria).

            • They were also more likely to run internal funding schemes to support the generation of impact, and (in the UK and Australia in particular) the collection of evidence to support impact claims for research assessment purposes.

            • Linked to this, they were more likely to include key performance indicators or success measures, especially linked to income targets and performance in research assessment exercises.

            Characteristics of achieving impact strategies:

            • They were typically constructed and operationalised from the bottom up and co-produced with beneficiaries.

            • They also had a strong emphasis on partnerships and engagement, but they were more likely to target specific stakeholder groups and organisations linked to the organisation’s strategic impact goals (for example, a profession or sector aligned with the organisation’s discipline(s) or mission) and scale of operation (for example, within the country or region of the world in which the organisation is based).

            • They often included methods and approaches designed to enable the organisation to operate as (or become) a boundary organisation, enabling researchers to co-produce research and impact with trusted networks of stakeholders, and respond to changing needs and contexts adaptively. This included, for example: the creation of knowledge brokerage roles within the organisation to connect researchers with specific sectors or communities; stakeholder advisory panels (operating at both institutional and project scales) to provide strategic direction and feedback to researchers; the facilitation of shared learning and innovation spaces, including facilitated workshops, unconferences and other forums, to enable co-production of research and impact; and the use of participatory and co-productive research methods, such as citizen science or participatory scenario development.

            • They were more likely to provide support and facilitate best practice at the scale of individual research projects or teams, with advice and resources tailored to their specific operational contexts. They also integrated across projects thematically to achieve economies of scale in specific impact domains, for example, developing scaling pathways to design, test and pilot initiatives that could generate impact across projects tackling related issues, or building on pilot work to generate impacts at broader scales through joint working.

            • In addition to providing training for researchers, they sometimes also prioritised capacity building for stakeholders to enable them to work more effectively in teams with researchers to achieve impact together.

            • They were more likely to include implementation plans, often using theory of change or logic models to visualise and plan for impacts, including, in some cases, assessments of risks and assumptions, and monitoring against baselines.

            • These strategies recognised and celebrated impact, but they were less reliant on extrinsic incentives such as promotions and awards, drawing more on the intrinsic motivation of researchers who already align with the mission and values of the organisation.

Enabling impact strategies may be supported by frameworks such as the 5Cs institutional impact health checklist (Bayley and Phipps, 2017) or the National Co-ordinating Centre for Public Engagement’s EDGE tool, which works from similar principles and has a focus on public engagement (NCCPE, 2010). The Knowledge Exchange Concordat (McMillan, 2020) offers ‘8 Principles’, all of which aim to engender an environment conducive to stronger knowledge exchange activity in research organisations. All of these frameworks use a maturity model to allow organisations to assess their stage of development and introduce improved strategies and plans against that benchmarking process. When comparing the themes that emerged from our analysis with these models, it is possible to see significant overlap in some areas (for example, engagement, resourcing and capacity building), but the role of leadership was emphasised less in the strategies we reviewed than it is in these frameworks. Achieving impact strategies may be supported by frameworks such as theory of change (Mayne, 2015), outcome mapping (Earl et al., 2001), logic models or the 7Cs of impact (Sreenan et al., 2019), as these frameworks support more change-oriented planning, and focus more on purpose and mission.

Many of the university strategies drew heavily on familiar notions of one-way research communication to a generalised public and the potential for commercialisation of new ‘ideas’ through business adoption or spin-offs, although newer forms of more synergistic relationships are being developed through investment in research precincts, co-location and incubators. In Canada, Australia and New Zealand, the focus on Indigenous communities reflects a coming of age of decolonising research and knowledge that was largely absent in other countries, including the UK, despite its long colonial history. Strategies from these other countries included equality, diversity and inclusion sections, but rarely linked these to research or impact. Instead, they were more likely to refer to research ethics, governance, open science and Responsible Research and Innovation principles. For example, Keele University included a section on ‘research integrity’, and the Stockholm Environment Institute, in its strategy, discussed research and impact in relation to ‘quality, integrity and independence’, ‘ethical practice’ and ‘environmental policy’. However, these issues were only considered in depth in relation to research and impact in a small minority of strategies outside Canada, Australia and New Zealand. In contrast, some strategies from this region cited the Australian and New Zealand Standard Research Classification Review (Ministry of Business, Innovation and Employment, 2020), which provides detailed guidance on engaging with vulnerable and hard-to-reach groups, including Aboriginal and Torres Strait Islander, Māori, Pacific Peoples and other Indigenous peoples.

Beyond the Indigenous focus, the more traditional modes of external engagement miss much of the messy complexity of generating impact from research, which could explain the lack of detailed implementation, monitoring and evaluation in most university strategies. The strategies from the specialist institutes, programmes and sub-units tended to be situated in a more specific context related to their mission, and so were able to reflect a deeper understanding of the multitude of ways in which engaging others in shaping, conducting and applying research can lead to greater impact. The dominant, existing frameworks, and the discourse on impact in most of the countries and universities, fail to acknowledge that not all research is likely to produce outcomes and impact within research assessment cycles. There is, therefore, a need to bridge the tension between research impacts that occur in the short term and those that accumulate over the long term. There is also ample literature suggesting that ‘co-production of knowledge’ can lead to greater research impact (Armstrong and Alsop, 2010; Redman et al., 2021). What perhaps goes unacknowledged and needs attention is the value of ‘vernacular expertise’ in the co-production of knowledge. In research conducted in non-English-speaking contexts in particular, vernacular expertise is typically accessed through a vernacular or local language, requiring engagement, translation and communication at multiple levels with communities and practitioners. This demands extra time and effort, which again tends to fall outside existing research assessment metrics (Chapman and Schott, 2020).

As Table 4 shows, the themes that emerged from the analysis are common to both types of strategy in our typology, and the two types could be combined through a nested approach across the scales of an organisation. For example, a research programme may apply an achieving strategy with targeted stakeholder partnerships, clearly identified spaces for shared learning and methods of engagement, and potentially a theory of change or logic model for impact. This programme strategy may be nested within a wider school strategy that could combine approaches that enable impact from the top down (for example, academic champions, communications support) and achieve impact from the bottom up (for example, celebrating impact, integrating research across themes, building stakeholder capacity to engage), and which itself is nested within a broader university-wide enabling strategy (providing professional support and training, access to impact funds, internal accountability mechanisms and an impact-aligned mission/vision).

            Drawing on our analysis, we propose the following lessons for developing an impact strategy:

• Refine ‘impact’ as an idea so that it is more inclusive and reflective of even small measures of relevance to communities, which can enhance ‘co-production of knowledge’ in a meaningful way.

            • Decide what type of impact strategy best meets the goals and context of your organisation, considering whether elements of both types might enrich your strategy overall, or for specific sub-units, sectors, beneficiary groups or other contexts.

            • Consider a nested approach to enable you to meet the needs of different levels of the organisation from both the bottom up and the top down.

            • Identify external frameworks that might help you develop this sort of strategy or provide benchmarking.

            • Systematically map stakeholder needs to organisational (or sub-unit) strengths and capabilities, supplementing existing partnerships with programmes of work driven with new stakeholders that emerge from this analysis.

            • Invest in keeping engagement with partners active (for example, via thematic or networking events, relationship managers and/or working with boundary organisations), and ensure it is two-way by building capacity for them to engage effectively with researchers where necessary, integrating partners into research via advisory roles and supporting placements for partner staff and for researchers in partner organisations.

            • Consider whether to invest in becoming a boundary organisation in particular sectors or disciplinary areas, with knowledge brokers proactively reaching out to, and becoming embedded within, stakeholder networks, or if there are existing boundary organisations that could more effectively connect and build trust across relevant networks.

            • Consider the roles and responsibilities needed to achieve the goals of your strategy. Where resources are limited, consider providing, or drawing on, core services accessible to all researchers, including training, monitoring and evaluation tools, event organisation or communications support. Then, invest more proactively and co-productively in strategic areas based on your needs analysis (above), rather than spreading resources so thinly that the depth and sustainability of your engagement suffers.

            • Consider the type of strategy you are developing, and create appropriate implementation and monitoring and evaluation plans. For enabling strategies, be clear about how the organisation will resource and deliver the strategy, and identify relevant indicators, baselines and measures of progress that enable impact-oriented research. For achieving strategies, more detailed implementation plans should be included. Implementation plans may include indicators of both activity and impact, and, if possible, baselines from which progress can be assessed, with individuals and teams given responsibility for managing impacts as they evolve, accountable to others in the organisation. Consider co-producing a theory of change and/or logic model with stakeholders in target sectors or programmes to identify detailed and flexible pathways to impact, and share responsibilities and resources with external organisations where possible.

            • Beware of how far you rely on extrinsic incentives. A strategy that does not include mechanisms to formally recognise and reward impact is likely to send a message that you do not actually value impact as an institution. However, too much focus on extrinsic incentives, especially if explicitly linked to funding and research assessment targets, is likely to instrumentalise impact and drive game-playing behaviours that could be counterproductive and demotivate staff.

While this last point draws in part on warnings contained in strategies, there were few concrete actions proposed to create impact cultures that drew on the intrinsic motivations of researchers to facilitate their engagement. Although impact champions may be effective in some groups, their effectiveness is likely to be strongly determined by the attitudes of the post-holder and the extent to which colleagues consider them to be opinion leaders. (Impact champions who command little respect or influence are likely to be ineffective, and influential post-holders who hold negative attitudes towards impact may do more harm than good.) The majority of training is focused on knowledge and skills (for example, impact literacy), but to create a ‘third generation’ impact culture (Rickards et al., 2020) that drives systemic change in the way researchers co-produce impact, training needs to create communities of practice where conversations can develop over time to challenge the ontological and epistemological assumptions underpinning both research and impact.

            Based on Reed and Fazey’s (2021) impact culture typology, the majority of university strategies reviewed for this paper were designed to perpetuate corporate impact cultures or research ‘and impact’ cultures. Corporate impact cultures are typically built from the top down via ‘enabling’ impact strategies, and although they can include significant stakeholder buy-in to partnerships and boundary organisations, they can lead to disidentification with the impact agenda and loss of autonomy for those whose identity, values and purpose do not align well with institutional impact narratives (Rosso et al., 2010; Reed and Fazey, 2021). Research ‘and impact’ cultures also tend to be developed from the top down via ‘enabling’ impact strategies, and they tend to relegate impact to an afterthought in an institutional research strategy, either as a rationale or justification for research, or as an end product of research, with limited active engagement or input from stakeholders (Rosso et al., 2010; Reed and Fazey, 2021). Integration of ‘achieving’ impact strategies with these ‘enabling’ strategies (for example, nested in the way described above), or a transition where relevant from enabling to achieving approaches, may be necessary to move towards a more co-productive impact culture, described by Reed and Fazey (2021: 13) as fostering ‘individual autonomy, confidence, and intellectual freedom’, whereby ‘specific impact goals are co-produced through active relationship and dialogue with stakeholders as a primary consideration in research’. As such, it may be necessary to pay more attention in future impact strategies to:

            • how research is conducted, considering discipline-specific mechanisms to increase the rigour and ethical basis of ‘responsible research and innovation’, and encouraging researchers to move beyond studying problems to start researching solutions in more action-oriented and co-produced programmes of research

            • how impact interacts with the intrinsic motivations of different researchers, shaping their individual sense of purpose, and the meaning they derive from work, and the emergence of groups with shared purposes that can be deepened through engagement with impact, even if generating impact is not itself part of their purpose (for example, considering how impact-generation opportunities might combine with new research opportunities to facilitate curiosity-driven enquiry along pathways to impact)

            • strategic approaches that enable bottom-up culture change, driven by researchers with their stakeholders, enabling multiple impact subcultures to develop among complementary communities of researchers and stakeholders, which are porous and dynamic, enabling these communities to work together where their needs and interests intersect, as they build trust and connection, and attend to the role of social norms and power

            • the kinds of capacity that are needed to enable action-oriented research, discovery of shared purpose and community building around impact, including skills, resources, leadership, and strategic and learning capacity.

            Although there were many excellent examples of impact strategies in our sample, we have identified four exemplar strategies to illustrate good practice in ‘achieving impact’ and ‘enabling impact’ strategies (the full title of each strategy is in Table 1, and the full text of these and the other strategies we analysed can be found in a database at www.fasttrackimpact.com/impactstrategies):

            • Achieving impact examples of good practice:

              • Dedicated impact strategy: Plymouth Marine Lab

              • Whole institute strategy: CGIAR

            • Enabling impact examples of good practice:

              • Dedicated impact strategy: University of Wollongong, Australia

              • Whole university strategy: King’s College London.

The impact strategies we assessed may well already be under review or revision, and new documents are produced regularly. A productive area of further research would be to examine how these strategies have or have not been implemented, and the changes they have effected so far, bearing in mind the inevitable ‘implementation gap’ (Derrick and Nickson, 2014). Alongside implementation assessment and evaluation, there is also an opportunity for organisations to move beyond the standard enabling approaches towards putting in place the mission, purpose and leadership that can achieve more effective impact outcomes. In addition, valuable research outputs and impact, which in turn affect university rankings, are contingent on researchers themselves feeling valued by their universities. Mechanisms to recognise, reward and incentivise research and researchers, alongside improvements in working conditions (an issue highlighted by recent industrial action in UK universities), can go a long way in shaping the university of the twenty-first century.

            Conclusion

Our methodology identified strategic documents with substantive goals and activities relating to impact from universities in six jurisdictions, in addition to a number of independent research institutes from around the world. To test for a bias towards English-speaking jurisdictions in our sample, strategies were identified for all universities in Denmark and Hong Kong, China, and all but one included substantive goals and activities relating to impact. This may indicate that there are missing impact strategies in the other jurisdictions included in the sample, which could be identified in future research via the systematic collection of strategies for all universities in each jurisdiction.

It is clear that more research is needed, but by showing for the first time how different types of institutions and countries are strategising impact, we have provided evidence to underpin a novel impact strategy typology. This typology is significant for two reasons. First, it provides a fine-grained understanding of the components of impact strategies, giving research managers a wealth of options for consideration as they develop and enhance their own impact strategies. Our analysis provides insights into a new and rapidly evolving field of professional practice across the international higher education and research sectors, showing the very different approaches that are being taken by research organisations to build capacity and plan for impact in response to research funders and assessments.

            Second, this snapshot of impact strategies around the world may also provide insights into the ways in which research organisations are reorienting and, in some cases, repurposing themselves to deliver impact as their core mission. The two types of strategy described in this paper are not mutually exclusive, and some strategies contained elements of both enabling and achieving impact. Each type of strategy has unique strengths, and by defining these clearly, we hope that our analysis will be used to increasingly combine best practice from each approach. In so doing, future impact strategies may be able to provide clear structures, roles and accountability for impact across large organisations, while facilitating more co-productive approaches to research and impact within and between projects. It may be possible to establish more specific and measurable impact goals and targets, while creating credible implementation plans that consider assumptions and risks, both for the delivery of impact and for unintended consequences. They may be able to harness the intrinsic motivation of some researchers around mission-focused engagement, while incentivising and rewarding engagement more widely, and paying attention to the potential negative outcomes sometimes associated with extrinsic incentives for impact.

Universities have a critical role in shaping society and the world. More and more higher education institutions are recognising the need to alter research praxis so that research and its impact are inclusive, emancipatory and transformative. Australian, Canadian and New Zealand strategies placed a strong emphasis on Indigenous rights and on the need to embed research principles and practice within an Indigenous ethos, safeguarding against the instrumentalisation of university research. However, this commitment to a more inclusive and transformative approach, in other words decolonial praxis at all levels, was largely absent from strategies in the UK, despite its colonial history. The more progressive strategies, both enabling and achieving, recognised that research is not conducted or applied in a vacuum. They acknowledged that, through building two-way relationships with external stakeholders (the public, industry, policymakers and so on), research may be co-produced to fill knowledge gaps, while delivering outcomes that are needed and prioritised by local and civic communities, the public and stakeholders.

            Impact strategies have the potential to articulate goals and implement activities to enable research to develop credible and relevant solutions to problems, increase the effectiveness or efficiency of existing systems and processes, and develop tangible new approaches to societal and planetary health and well-being. However, they also have the potential to communicate aspirations without meaningful follow-through, or to play into existing instrumental narratives of impact as a way of generating new income streams or climbing league tables. Whether an enabling or an achieving impact strategy, the power of these documents is in the specificity of the activities and accountability mechanisms that will enable aspirations for impact to be translated into the kinds of cultures that drive real, transformational change to meet twenty-first-century challenges.

            Declarations and conflicts of interest

            Research ethics statement

            Not applicable to this article.

            Consent for publication statement

            Not applicable to this article.

            Conflicts of interest statement

            The authors declare the following interests: Mark Reed is the Chief Executive Officer of Fast Track Impact Ltd and Saskia Gent is Director of Insights for Impact. Mark Reed and Saskia Gent provided consultancy support to Plymouth Marine Laboratory in the development of their impact strategy and have previously advised University of Hong Kong on their development of impact case studies. All efforts to sufficiently anonymise the authors during peer review of this article have been made. The authors declare no further conflicts with this article.

            References

            1. Adam P, Ovseiko PV, Grant J, Graham KE, Boukhris OF, Dowd AM, Balling GV, Christensen RN, Pollitt A, Taylor M, Sued O. 2018. ISRIA statement: Ten-point guidelines for an effective process of research impact assessment. Health Research Policy and Systems. Vol. 16(1):1–16. [Cross Ref]

            2. ARMA (Association of Research Managers and Administrators). 2020. The ARMA Survey on Research Culture 2020. Accessed 11 November 2021 https://arma.ac.uk/wp-content/uploads/2021/03/ARMA-Research-Culture-Survey-2020.pdf

            3. Armstrong F, Alsop A. 2010. Debate: Co-production can contribute to research impact in the social sciences. Public Money & Management. Vol. 30(4):208–10. [Cross Ref]

            4. Bayley JE, Phipps D. 2017. Real Impact: Institutional healthcheck workbook. Emerald Publishing. Accessed 11 November 2021 https://www.emeraldpublishing.com/wordpress/wp-content/uploads/Emerald-Resources-Institutional-Healthcheck-Workbook.pdf

            5. Bayley JE, Phipps D. 2019. Building the concept of research impact literacy. Evidence & Policy. Vol. 15(4):597–606. [Cross Ref]

            6. Braun V, Clarke V. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology. Vol. 3:77–101. [Cross Ref]

7. Braun V, Clarke V, Hayfield N. 2015. Thematic analysis. In: Smith JA (ed.). Qualitative Psychology: A practical guide to research methods. London: Sage. p. 222–48.

            8. Bruce A, O’Callaghan K. 2016. Inside out: Knowledge brokering by short-term policy placements. Evidence & Policy. Vol. 12(3):363–80. [Cross Ref]

            9. Campbell HJ, Vanderhoven D. 2016. Knowledge That Matters: Realising the potential of co-production. Manchester: N8 Research Partnership. Accessed 11 November 2021 https://www.n8research.org.uk/view/5163/Final-Report-Co-Production-2016-01-20.pdf

            10. Chapman JM, Schott S. 2020. Knowledge coevolution: Generating new understanding through bridging and strengthening distinct knowledge systems and empowering local knowledge holders. Sustainability Science. Vol. 15(3):931–43. [Cross Ref]

11. Chubb J, Reed M. 2017. Epistemic responsibility as an edifying force in academic research: Investigating the moral challenges and opportunities of an impact agenda in the UK and Australia. Palgrave Communications. Vol. 3(1):20. [Cross Ref]

            12. Chubb J, Reed MS. 2018. The politics of research impact: Academic perceptions of the implications for research funding, motivation and quality. British Politics. Vol. 13:295–311. [Cross Ref]

            13. Chubb J, Watermeyer R. 2017. Artifice or integrity in the marketization of research impact? Investigating the moral economy of (pathways to) impact statements within research funding proposals in the UK and Australia. Studies in Higher Education. Vol. 42:2360–72. [Cross Ref]

            14. Chubb J, Watermeyer R, Wakeling P. 2017. Fear and loathing in the academy? The role of emotion in response to an impact agenda in the UK and Australia. Higher Education Research & Development. Vol. 36(3):555–68. [Cross Ref]

            15. Cook T, Boote J, Buckley N, Vougioukalou S, Wright M. 2017. Accessing participatory research impact and legacy: Developing the evidence base for participatory approaches in health research. Educational Action Research. Vol. 25(4):473–88. [Cross Ref]

            16. Cvitanovic C, Löf MF, Norström AV, Reed MS. 2018. Building university-based boundary organisations that facilitate impacts on environmental policy and practice. PLoS ONE. Vol. 13(9):e0203752. [Cross Ref]

            17. Department for Business, Energy and Industrial Strategy. 2021. R&D People and Culture Strategy: People at the heart of R&D. Accessed 20 October 2022 https://www.gov.uk/government/publications/research-and-development-rd-people-and-culture-strategy

            18. Derrick G, Nickson A. 2014. Invisible intermediaries: A systematic review into the role of research management in institutional research processes. Journal of Research Administration. Vol. 45(2):11–45. Accessed 12 October 2022 https://files.eric.ed.gov/fulltext/EJ1157172.pdf

            19. Earl S, Carden F, Smutylo T. 2001. Outcome Mapping: Building learning and reflection into development programs. Ottawa: IDRC.

            20. Geuna A, Piolatto M. 2016. Research assessment in the UK and Italy: Costly and difficult, but probably worth it (at least for a while). Research Policy. Vol. 45:260–71. [Cross Ref]

21. Heyeres M, Tsey K, Yang Y, Yan L, Jiang H. 2019. The characteristics and reporting quality of research impact case studies: A systematic review. Evaluation and Program Planning. Vol. 73:10–23. [Cross Ref]

            22. Hill S. 2016. Assessing (for) impact: Future assessment of the societal impact of research. Palgrave Communications. Vol. 2:16073. [Cross Ref]

            23. Joly P-B, Matt M. 2017. Towards a new generation of research impact assessment approaches. Journal of Technology Transfer. Vol. 47:621–31. [Cross Ref]

            24. Kwarteng K. 2021. Business update: Statement made on 22 July 2021. Statement UIN HCWS246. Accessed 20 October 2022 https://questions-statements.parliament.uk/written-statements/detail/2021-07-22/hcws246

            25. Leeuwis C, Klerkx L, Schut M. 2018. Reforming the research policy and impact culture in the CGIAR: Integrating science and systemic capacity development. Global Food Security. Vol. 16:17–21. [Cross Ref]

            26. MacGregor S, Phipps D. 2020. How a networked approach to building capacity in knowledge mobilization supports research impact. International Journal of Education Policy and Leadership. Vol. 16(6):1–22. [Cross Ref]

27. MacGregor S, Phipps D, Edwards CM, Kyffin J, Portes V. 2020. Active engagement of Canadian research institutions will foster the future of knowledge mobilization and research impact. Report prepared for the Social Sciences and Humanities Research Council, Ottawa. Accessed 15 October 2022 https://researchimpact.ca/kmb_resource/active-engagement-of-canadian-research-institutions-will-foster-the-future-of-knowledge-mobilization-and-research-impact/#_ftnref1

            28. Mayne J. 2015. Useful theory of change models. Canadian Journal of Program Evaluation. Vol. 30(2):119–142. [Cross Ref]

            29. McMillan T. 2020. Concordat for the Advancement of Knowledge Exchange in Higher Education. Universities UK. Accessed 20 October 2022 https://www.universitiesuk.ac.uk/sites/default/files/field/downloads/2021-07/knowledge-exchange-concordat.pdf

            30. Meyer M. 2010. The rise of the knowledge broker. Science Communication. Vol. 32(1):118–27. [Cross Ref]

            31. Ministry of Business, Innovation and Employment. 2020. ANZSRC. Accessed 20 October 2022 https://www.mbie.govt.nz/science-and-technology/science-and-innovation/research-and-data/anzsrc/

            32. Moran H, Karlin L, Lauchlan E, Rappaport SJ, Bleasdale B, Wild L, Dorr J. 2020. Understanding research culture: What researchers think about the culture they work in. Wellcome Open Research. Vol. 5:201. Accessed 20 October 2022 https://wellcomeopenresearch.org/articles/5-201

            33. NCCPE (National Co-ordinating Centre for Public Engagement). 2010. The EDGE Tool. Accessed 11 November 2021 https://www.publicengagement.ac.uk/support-engagement/strategy-and-planning/edge-tool

            34. Neal JW, Neal ZP, Brutzman B. 2021. Defining brokers, intermediaries, and boundary spanners: A systematic review. Evidence & Policy. Vol. 18(1):7–24. [Cross Ref]

            35. Oancea A. 2019. Research governance and the future(s) of research assessment. Palgrave Communications. Vol. 5:1–12. [Cross Ref]

            36. Penfield T, Baker MJ, Scoble R, Wykes MC. 2014. Assessment, evaluations, and definitions of research impact: A review. Research Evaluation. Vol. 23(1):21–32. [Cross Ref]

            37. Redman S, Greenhalgh T, Adedokun L, Staniszewska S, Denegri S; on behalf of the Co-production of Knowledge Collection Steering Committee. 2021. Co-production of knowledge: The future. British Medical Journal. Vol. 372:n434. [Cross Ref]

            38. Reed MS, Fazey I. 2021. Impact culture: Transforming how universities tackle twenty first century challenges. Frontiers in Sustainability. Vol. 2:662296. [Cross Ref]

            39. Reed MS, Bryce R, Machen RM. 2018. Pathways to policy impact: A new approach for planning and evidencing research impact. Evidence & Policy. Vol. 14(3):431–58. [Cross Ref]

            40. Reed MS, Ferré M, Martin-Ortega J, Blanche R, Lawford-Rolfe R, Dallimer M, Holden J. 2021. Evaluating impact from research: A methodological framework. Research Policy. Vol. 50(4):104147. [Cross Ref]

            41. Reichard B, Reed MS, Chubb J, Hall G, Jowett L, Peart A, Whittle A. 2020. Writing impact case studies: A comparative study of high-scoring and low-scoring case studies from REF2014. Palgrave Communications. Vol. 6(1):1–17. [Cross Ref]

42. Rickards L, Steele W, Kokshagina O, Morales O. 2020. Research Impact as Ethos. Melbourne: RMIT University. Accessed 11 November 2021 https://cur.org.au/cms/wp-content/uploads/2020/09/rickards-et-al-2020-research-impact-as-ethos.pdf

            43. Rosso BD, Dekas KH, Wrzesniewski A. 2010. On the meaning of work: A theoretical integration and review. Research in Organizational Behavior. Vol. 30:91–127. [Cross Ref]

44. Royal Society. n.d. Research culture. Accessed 20 October 2022 https://royalsociety.org/topics-policy/projects/research-culture

            45. Ryan RM, Deci EL. 2000. Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology. Vol. 25(1):54–67. [Cross Ref]

46. Sreenan N, Hinrichs-Krapels S, Pollitt A, Rawlings S, Grant J, Wilkinson B, Pow R, Kinloch E. 2019. Impact by design: Planning your research impact in 7Cs [version 1; peer review: 2 approved with reservations]. Emerald Open Research. Vol. 1:18. [Cross Ref]

            47. Stanley A. 2016. Strengthening Networks and Building Relationships to Increase the Impact of Global Development Research. Impact Lab Learning Guide. Accessed 11 November 2021 https://www.theimpactinitiative.net/impactlab/learning_guide/strengthening-networks-and-building-relationships-increase-impact-global

            48. Tsey K, Onnis L, Whiteside M, McCalman J, Williams M, Heyeres M, Lui SM, Klieve H, Cadet-James Y, Baird L, Brown C, Watkin Lui F, Grainger D, Gabriel Z, Millgate N, Cheniart B, Hunter T, Liu H, Yinghong Y, Yan L, Lovett R, Chong A, Kinchin I. 2019. Assessing research impact: Australian Research Council criteria and the case of Family Wellbeing research. Evaluation and Program Planning. Vol. 73:176–86. [Cross Ref]

            49. Ward V. 2017. Why, whose, what and how? A framework for knowledge mobilisers. Evidence & Policy. Vol. 13(3):477–97. [Cross Ref]

            50. Watermeyer R. 2019. Competitive Accountability in Academic Life: The struggle for social impact and public legitimacy. Cheltenham: Edward Elgar.

            51. Weißhuhn P, Helming K, Ferretti J. 2018. Research impact assessment in agriculture: A review of approaches and impact areas. Research Evaluation. Vol. 27(1):36–42. [Cross Ref]

            52. Wenger-Trayner E, Wenger-Trayner B. 2015. Introduction to communities of practice: A brief overview of the concept and its uses. Accessed 20 October 2022 https://www.wenger-trayner.com/wp-content/uploads/2022/06/15-06-Brief-introduction-to-communities-of-practice.pdf

            Author and article information

            Journal
            rfa
            Research for All
            UCL Press (UK)
            2399-8121
            22 November 2022
            Volume 6, Issue 1, Article 24
            Affiliations
            [1 ]Rural Policy Centre and Thriving Natural Capital Challenge Centre, Department of Rural Economy, Environment & Society, Scotland’s Rural College (SRUC), Edinburgh, UK
            [2 ]Insights for Impact, Basingstoke, UK
            [3 ]School of Global Studies, University of Sussex, Brighton, UK
            [4 ]Natural Resources and Sustainable Development, Department of Earth Sciences, Uppsala University, Sweden
            [5 ]School of GeoSciences, University of Edinburgh, UK
            [6 ]WWF Scotland, Edinburgh, UK
            Author information
            https://orcid.org/0000-0002-8958-8474
            https://orcid.org/0000-0002-8133-2968
            https://orcid.org/0000-0002-7683-0428
            https://orcid.org/0000-0001-9489-4700
            Article
            10.14324/RFA.06.1.24
            Copyright 2022, Mark S. Reed, Saskia Gent, Fran Seballos, Jayne Glass, Regina Hansda and Mads Fischer-Møller

            This is an open-access article distributed under the terms of the Creative Commons Attribution Licence (CC BY) 4.0 https://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.

            History
            Received: 15 June 2021; Accepted: 01 July 2022
            Page count
            Figures: 3, Tables: 4, References: 52, Pages: 33
            Categories
            Article

            Subjects: Assessment, Evaluation & Research methods; Education & Public policy; Educational research & Statistics
            Keywords: research impact, valorisation, knowledge exchange, knowledge transfer, KMb, impact strategy, impact culture
