
      Lessons from working across fields to develop a framework for informed choices


            Abstract

            In late 2018, Iain Chalmers, Andy Oxman and others from the Informed Health Choices team convened a cross-field forum to develop a generic framework of key concepts for thinking critically about claims, research and choices about interventions, with the aim of supporting ‘informed choices’. We define an informed choice as one that is based on critical understanding of the relevant available evidence. This paper describes the process of that cross-field engagement, and reflects on how consensus was reached on the generic framework. Working in an alliance of 24 researchers from across fields to develop the Key Concepts for Informed Choices framework, we learned three lessons about cross-field working: (1) there was much agreement, despite diversity of views and experiences; (2) the applications of our work were broader than we could have imagined; and (3) we identified a wide range of problems that we have in common when making informed choices. Here we describe our experience of working together to develop the framework, and draw out lessons for others who may be involved in similar cross-field initiatives.


            Key messages
            It is possible to reach consensus in cross-field engagements when these are structured to encourage openness and respect.
            There is considerable commonality in the evidence-informed approaches that exist in a wide range of fields.
            We need to document the processes of cross-field engagements, and not just their outputs.

            Background

            When Iain Chalmers, Andy Oxman and others from the Informed Health Choices team (www.informedhealthchoices.org) convened a cross-field forum to develop a generic framework to support people in making choices informed by critical understanding of the relevant available evidence (that is, ‘informed choices’), they did not know how such an interdisciplinary group would work. A generic framework to support these informed choices was subsequently produced and published elsewhere (Aronson et al., 2019). This companion paper describes how we collaborated, the underlying discussions, and our experiences of working together across disciplines and fields (see Box 1 for definitions). The project had an inherently experimental feel from the outset. When we began this task, for instance, we were not sure whether we would all agree on the nature of the problem we were working to solve, whether we could identify enough shared items for inclusion in our framework, or whether we could produce something that was applicable across the full range of potential uses that our fields represented.

            Box 1:

            Definitions

Discipline: A branch of knowledge as studied in higher education.
Field: A sphere of activity, or a domain, that can incorporate more than one academic discipline.
Informed choice: A choice that is based on critical understanding of the relevant available evidence.

            There is a growing body of research on the value of breaking down traditional boundaries in academia (see, for example, Heinze et al., 2009). This is combined with increased awareness of the need to work in transdisciplinary and cross-field ways to prepare our students for the future, and to generate useful knowledge for decision making (Sharples et al., 2017). Several organisations, including the Global Research Council (2016), the Wellcome Trust (2017) and the Research Council of Norway (2019) have encouraged interdisciplinary innovation, but there is little understanding of how such innovation can occur. To gain a sense of the challenges that interdisciplinary work imposes, one must first understand that each discipline trains its followers to engage with ideas and with peers, using frameworks that often differ fundamentally from discipline to discipline. For example, psychology very broadly focuses on individuals and their rights, responsibilities, norms and practices, while sociology prioritises the collective, and focuses on a shared community-level identity and centres of control. Educational research operates generally under an assumption that each learner, classroom and school is unique. A clinician, on the other hand, seeks to individualise the treatment of a patient with some condition, based on the average results of research in a population. These examples demonstrate the widely varying paradigms within which different academic disciplines operate, and provide some idea of why cross-disciplinary work is more unusual and more challenging than one might expect.

            Evidence-based approaches focus on informing real-world choices. Helping individuals, communities, citizens and policymakers tackle important problems often requires solutions that span more than one field (Wilkins and Cooper, 2019). Improving evidence-informed decision making for a young child’s health, for example, requires input from primary schools, public health practitioners, clinicians and families. Addressing issues such as high youth unemployment requires evidence on economic growth, higher education and learning, business development, entrepreneurship and innovation. The more complex the problem, the greater the need to break down traditional disciplinary boundaries.

            The SARS-CoV-2 coronavirus disease (COVID-19) pandemic has highlighted the need for cross-field collaboration. For example, addressing the accompanying ‘infodemic’ (WHO, 2020a) requires more than epidemiology. Meanwhile, the recommendation to randomise school closures to improve certainty about their effects on both health and non-health outcomes (Fretheim et al., 2020) might be strengthened with collaborations involving colleagues from education and economics. There has been criticism that some national pandemic response teams are limited by their failure to include, for example, economists or behavioural or social scientists, while others have been challenged to be more transparent in their processes (see, for example, Clark, 2020). Meanwhile, the COVID-19 Evidence Network to support Decision-making (COVID-END) is bringing together several fields to help decision makers find the best available evidence, and to help researchers coordinate their efforts, rather than duplicating them (www.mcmasterforum.org/networks/covid-end).

            Perhaps encouragingly, a large number of disciplines within broad fields, such as health, education and development (see Table 1), have committed to a shared goal of ensuring that decisions are informed by the best available evidence, and have established their own norms and practices for supporting evidence-informed decision making. However, apart from a few exceptions, such as in health promotion and other allied health professions, these parallel initiatives seldom work together. This is despite a recognition of the potential for cross-field activities. Academic discussion and debate across evidence-based fields has largely taken place in silos, with limited cross-field discussion or fertilisation of ideas. For example, we know that parallel discussions have taken place in field-specific evidence-related conferences, such as the Cochrane Colloquia and the Collaboration for Environmental Evidence conferences, with little or no awareness that other experts in a different field are focusing on solving the same problems – the development of a multitude of methods for the synthesis of qualitative research is just one example (see Barnett-Page and Thomas, 2009). It was the combination of the shared commitment to the use of evidence across fields, and the recognition that the fields rarely work together, that motivated this cross-field initiative.

            Table 1:

            The range of disciplines, affiliations and countries included in the meeting (Source: Authors, 2022)

Discipline | Affiliation | Country
Agriculture | Centre for Evidence-Based Agriculture, Harper Adams University | UK
Economics | Institute of Health and Society, Newcastle University | UK
Education/schools | Education Endowment Foundation; Graduate School of Education and Statistics Department, University of Pennsylvania; Kritikos | UK; USA; UK
Engineering | School of Engineering and Applied Science, Aston University | UK
Environment | Collaboration for Environmental Evidence; Centre for Evidence-Based Conservation, Bangor University | Global; UK
International development | International Initiative for Impact Evaluation (3ie); Africa Centre for Evidence, University of Johannesburg | India; South Africa
Health care, including a number of specialisms (in particular: public health, general internal medicine, mental health) | Centre for Informed Health Choices, Norwegian Institute of Public Health; Faculty of Health Sciences, Oslo Metropolitan University; Regional Centre for Child and Adolescent Mental Health and Child Welfare (Eastern and Southern Norway); Centre for Evidence-Based Medicine, University of Oxford; Faculty of Medicine, University of Manchester | Norway; Norway; Norway; UK; UK
Journalism and media/fact checking | Africa Check | Africa
Management and business | Centre for Evidence-Based Management (CEBMa) | Global
Museums/public engagement | Museum of Natural History, University of Oxford | UK
Nutrition | Centre for Evidence-Based Health Care, Stellenbosch University | South Africa
Planetary health | Centre for Climate Change and Planetary Health, London School of Hygiene and Tropical Medicine | UK
Policing/crime reduction | What Works Centre for Crime Reduction, College of Policing | UK
Psychology | Department of Experimental Psychology, University of Oxford | UK
Social welfare/social work | Campbell Collaboration; School of Social Work, Saint Louis University | Global; USA
Speech and language therapy | School of Health Sciences, University of Central Lancashire | UK
Veterinary medicine | Centre for Evidence-based Veterinary Medicine | UK

Interdisciplinarity and informed choices

The Informed Health Choices (IHC) project team has developed key concepts for thinking critically about health-care claims, research and choices (Chalmers et al., 2018). In part because it is hard to change adults’ behaviours, they have focused largely on working with young people (Semakula et al., 2020). This reflects a general trend in this kind of work towards initiatives among schoolchildren or undergraduates (Oxman et al., 2021). The challenge, however, is not one that applies only in the health-care field, nor only to young people. Claims are made in the public arena every day about topics such as nutrition, ways of guaranteeing success in examinations, and how to make money quickly. Unfortunately, people often fail to question such claims, that is, to ask for or scrutinise the supporting evidence. In recognition of the commonality of the problem of ensuring that decisions are informed by the best available evidence, and in the belief that solutions to our common problems are better sought together, the authors volunteered to join a cross-field group which convened in late 2018. Our aim was to develop a generic framework for assessing claims and supporting informed choices. The group included 24 researchers from a wide range of disciplines (see Table 1). What we had in common was that we all work to support the use of evidence in decision making in our respective fields.

            This paper shares lessons of interest to researchers and others working in cross-disciplinary collaborations. Our aim in sharing our observations is to engage further with what we have learned and identify areas for improvement to enable greater trans-disciplinary and cross-field work in the future.

            Approach

We developed the shared framework in three phases: before, during and after a two-day meeting in Oxford (UK) in December 2018. Before the meeting, the IHC framework (Chalmers et al., 2018) was circulated, and all participants were encouraged to bring their reflections on it to the meeting. This included sharing preliminary elements of the Informed Choices framework with colleagues and gathering their input on which elements they saw as relevant to their discipline. After the meeting, we continued to refine and develop our thinking by editing the emergent framework, preparing a report on our work (Aronson et al., 2019) and developing website content (https://thatsaclaim.org). Table 2 summarises aspects of our engagement.

            Table 2:

            Steps taken to elicit inputs across participants and encourage cross-field engagement (Source: Authors, 2022)

            Before the meeting
            • Disseminate pre-meeting tasks/documentation to enable major queries to be addressed before the meeting

            • Gather wider contributions by encouraging each of us to have conversations with our own teams

            During the meeting
            • Encouragement to share openly – ‘no wrong answers’

            • Show respect for all contributions

            • Encouragement to avoid field-specific language; take care in using technical language, and develop a common language that transcends the boundaries of the individual fields

            • Identify common ground

            • Take our own notes, including those about problems with working together across fields and lessons learned

            After the meeting
            • Participation in continuing engagement by email, refining the shared framework

            • Inclusive authorship and contributions to a shared paper based on the framework

            • Continuing engagement in each of our fields to tailor, adapt and exemplify the key concepts for our specific contexts

            • Disseminate within fields, and to general audiences, and actively promote its use

            Because of her personal interest in boundary-spanning initiatives, Ruth Stewart took notes about the process of working together across fields, which have directly contributed to this paper. In the subsequent months, she documented and shared key observations with the group by email, and invited others to reflect on these and help identify key lessons from the process of our cross-field working. Our shared reflections and contributions were distilled and grouped into themes, and are shared in this paper.

            The resulting framework

            Our cross-field work resulted in a framework that includes many concepts that are important to both laypeople and professionals aiming to make or support informed choices, within at least 14 fields. The concepts are in three overarching groups: (1) claims of effects that ‘should be supported by evidence from fair comparisons’; (2) comparisons that should be ‘fair comparisons, designed to minimise the risk of systematic errors (biases) and random errors (the play of chance)’; and (3) choices that ‘depend on judgements about the problem, the relevance (applicability or transferability) of evidence available and the balance of expected benefits, harm and costs’ (Aronson et al., 2019: 304). Each overarching group is unpacked within the cross-field framework reported in full and available online (https://thatsaclaim.org).

            Findings

            Our findings about the process of developing this framework reflect three broad themes:

1. the extent and nature of agreement over the problems we were trying to solve, and where diversity of experiences and views was voiced

            2. the extent to which we identified common problems and prioritised them for inclusion in the framework we were developing, while also recognising differences

            3. the breadth of applicability of the framework we developed.

            In addition, we found several themes that cut across the elements of the Informed Choices framework that we developed. These cross-cutting themes are summarised in Table 3.

            Table 3:

            Common themes (Source: Authors, 2022)

Elements of the final Informed Choices framework, each followed by the themes of commonality that arose from the cross-field discussions:

Claims: Claims about effects should be supported by evidence from fair comparisons. Other claims are not necessarily wrong, but there is an insufficient basis for believing them.
• Common observation that interventions get implemented in good faith based on a belief that something works, but belief is not enough to substantiate the claim, and resources can be wasted and harm caused by such claims.
• Claims are not made in a vacuum, but are influenced by a wide range of other factors, and often become the accepted norm, even if contrary to the evidence base.
• Small-scale studies are common across disciplines, and are part of the problem.
• There is a need for examples of harms that can occur, and resources wasted, when you are not making decisions based on evidence, to make the point in a powerful way so that people will take notice.

Comparisons: Studies should make fair comparisons, designed to minimise the risk of systematic errors (biases) and random errors (the play of chance).
• There is a need for common terms for describing study designs that make fair comparisons, and that are designed to minimise biases.
• There is a lack of high-quality evidence on which to base decisions at all levels, from practice to policy.

Choices: What to do depends on judgements about the problem, the relevance (applicability or transferability) of the evidence available, and the balance of expected benefits, harms and costs.
• Recognition of the trade-offs that decision makers make, and that these take place at several levels, balancing multiple benefits, harms and costs.
• Cognitive bias seems to be a stumbling block.

The extent and nature of agreement about the problems we were trying to solve

On the first day, we focused on understanding the work done to date by those who had convened the meeting (the IHC project), and on developing a shared understanding of the value of a cross-field framework. All participants acknowledged early on that our shared purpose centred on helping individuals and groups make well-informed decisions about interventions, primarily by improving the quality of communication of scientific information. We quickly recognised the value of working together, and the need to enable individuals to think critically about claims. At no point was the goal of developing a shared framework to support the use of evidence for informed choices challenged.

            Despite the breadth of backgrounds and experiences within the group, several of us had taken part in cross-disciplinary work previously, and we self-selected to an extent in accepting the invitation to take part in this initiative. We were therefore each already convinced about the value of working across traditional boundaries. Discussions at the meeting were rich, varied and wide ranging. We agreed that across all fields, choices (when options are considered) and decisions (when one action is chosen) are wide ranging and differ from one another in degree, scale and nature. Discussions across fields about the need for a common framework helped us to think more broadly about the nature of the informed choices people make, and the kinds of claims with which they are faced. We learned from both our commonalities and our differences.

We also recognised the need to consider all levels of decision makers, from individuals making choices for themselves and their families, to health-care professionals and government policymakers making choices on behalf of individuals and whole populations. We recognised the extent to which public policy in its broadest sense (that is, policy that affects the public, whatever its source) is often shaped by a wide range of considerations. This included discussion of how policymakers understand public debate. We acknowledged that their decisions are shaped by perceived public support, official considerations, and a complex interaction between what the evidence suggests and how much policymakers are willing to risk political failure in following the evidence. We also identified barriers that affect whether, and how, researchers and scientists inform policy. We discussed the role that misinformation can play, and agreed on the importance of tackling it at all levels.

            Discussions across fields helped us to see how the framework we were developing could be useful beyond health choices. For example, we discussed the strength of the available evidence and agreed that small-scale studies, which are common across disciplines, are part of the problem behind many misleading claims. We found examples from each of our fields, for example, a systematic review of management research that found a predominance of small studies with low internal validity (Barends et al., 2014).

            The commonalities across fields not only related to experiences and ideas, but also to the degree of applicability of the framework across those fields. We all agreed that a mechanistic hypothesis, although important, is not on its own enough to justify implementation of an intervention until it has been tested. A key observation was that interventions, implemented in good faith based on a theory that something should work, can lead to wasted resources and even to harm. We did nevertheless observe that the development of theory, often using tools such as logic models, is a common precursor to any experiment, with differing levels of importance in different fields.

            We observed that informed choices are not made in a power vacuum. The power and influence behind some ideas are forceful. For example, microloans to people living in poverty, with the intention of supporting them to generate income through small businesses, are usually provided at very high interest rates, and an overwhelming body of evidence suggests that this does more harm than good. Nevertheless, the desire to find solutions for poverty makes the intervention hugely attractive and very popular. Muhammad Yunus – considered by many as the founding father of microfinance – was awarded the Nobel Peace Prize in 2006. It is difficult to challenge accepted truths and change norms.

            Other examples include the lack of global support for electric cars (Paine et al., 2006), the decades before the harms of smoking were accepted (Warner, 1991), and the limited action against the sugary drinks industry (Public Health England, 2019). Sometimes those with influence, whether big pharma, big donors or political parties, put their weight and their money behind an intervention, and it takes a lot to counter this. Most recently, we have observed persistence of the belief that hydroxychloroquine is an effective treatment for COVID-19, despite high-quality evidence to the contrary, following support for it from US President Donald Trump (Khuroo et al., 2020). A few public initiatives specifically challenge accepted truths, including Ask for Evidence (www.askforevidence.org). We agreed that all decision makers, including at national and international levels, have multiple factors to balance in decision making.

            We found common ground when considering the important roles that the media (including social media) play in influencing the choices that people make, and explored how to counter the power of misinformation in all forms of media. Another shared challenge related to the power of commercial interests, and how well-resourced (but not necessarily evidence-based) organisations can use their financial weight to drive policies and interventions that are extremely difficult to challenge without considerable funding and effort.

We agreed on the potential value of a generic framework of key concepts to inform choices, as reflected in our willingness to press on with exploring applications in different fields, and implications for the detailed sections of the framework (see below).

            How we identified and prioritised common problems for inclusion in the framework

            When we considered adapting the original Informed Choices tool, which was developed for decisions in health care, into a framework for use across different fields, one of the main topics we discussed related to wording, the use of particular terminology and semantics, as opposed to differences in the concepts themselves. Colleagues in different disciplines use different jargon to characterise studies that focus on comparisons – randomised clinical trials, randomised social experiments, cluster-randomised trials, place-randomised trials. For example, in most disciplines, a randomised trial is referred to as an ‘experiment’, but in the domain of management, an experiment means ‘trying something out to see what happens’, and is a synonym for ‘pilot’. Nevertheless, it was understood that the phrases embody a common idea, that of fair comparisons in estimating outcomes. In another example, colleagues in each discipline alluded in varied ways to the tentativeness of evidence. We agreed that this concept of uncertainty is fundamental, and a reason for doing more than one study and for learning from mistakes.

            Once we had aired, discussed and recognised the subtle differences in how we use words such as ‘trial’, ‘experiment’, ‘decision’, ‘policy’ and ‘harm’, we moved on to discussing the framework itself. This helped us to identify common factors that shape decisions, which in turn were included in the new cross-field framework.

            We observed that across all fields, decision making involves trade-offs at several levels, balancing multiple benefits, harms and costs. We learned that in some fields it is routine to consider harms and costs (medicine, dentistry, veterinary medicine and international development, for example). In other fields, benefits, harms and costs might take longer to become apparent, and harms may not even be considered, for example, in education (Zhao, 2017). With some decisions it is necessary to balance benefits, harms and costs across multiple people, groups or populations of other species, for example, in environmental management. In some fields, such as planetary health, some people for whom this balance must be considered may not even have been born yet. Costs vary enormously across fields. They include both fiscal costs and opportunity costs, and they can be immediate or long term. In agriculture, for example, intensification through increased use of pesticides and/or fertilisers may meet increasing food production requirements, but can cause harm in the wider environment. In the longer term, there may also be cost implications, particularly to future generations, as damage to natural support services, such as soils or pollinators, can have economic consequences.

            Opportunity costs vary according to context: what we have to give up in order to adopt a course of action can vary for each individual, and from group to group and place to place. In resource-poor settings, the opportunity cost of seemingly trivial actions can be enormous. Furthermore, we recognised that across fields, decisions can be made at many levels (personal, for family or friends, in a school or hospital, in a city, regionally, nationally or internationally), and the distribution of harms, benefits and costs varies with context. Who gains and who loses can sometimes be as important as maximising an outcome, if not more so. In evidence-based management, this is known as ‘stakeholder evidence’ (yet another term which means different things in different fields) – evidence that allows analysis of impact on different groups and introduces an ethical dimension. The need for fair comparisons to enable balancing of options cuts across fields and levels of decision being made.

            Language commonly generates tension in all fields; for example, the contrast between warning language and neutral statements – ‘Beware false news’ versus ‘This is what an evidence-based claim looks like’. We agreed that the framework should contain statements that are as neutral as possible about the value of evidence, which could be adapted in different fields and for different uses. For example, the framework states: ‘Comparisons of interventions should be fair’ (Aronson et al., 2019: 304), rather than ‘Beware comparisons that are unfair’.

            We developed the option for each field of enhancing the framework and sharing additional field-specific information. We agreed that claims are not made in a vacuum, and we discussed the importance of ethical issues. However, we decided not to include them in the framework, because they can be context specific. We agreed that irrespective of the field, there is a need for relevant examples of what happens when decision making is not evidence based, so that people will take notice. These have been included in field-specific sections of the project website (www.thatsaclaim.org).

            The breadth of applicability of the framework

            While we discussed how we might adapt the framework for different fields, there was never any suggestion that the framework would not be useful in any specific field. Rather, there was a lot of common ground across fields. Participants described cross-field work as a pleasure, reflecting a new way of working without disciplinary boundaries, engendering optimism. We felt reassured that we all speak the same broad language, and that the principles of evidence-informed decision making (and thus the framework that we developed) apply across all fields. To date, the project website includes field-specific enhancements for 8 of the 14 fields involved in the development of the framework (www.thatsaclaim.org).

Finally, there was a shared recognition that the framework is not an endpoint for any of our fields. Implementation of decisions, whether individual choices or policy choices, is an essential aspect of informed choices, and is often lacking; policy choices in any field require implementation.

            Discussion

            Cross-field, interdisciplinary approaches are possible, and the learning from them is hard to overestimate. With COVID-19 dominating many current policy debates, and the proliferation of related research requiring synthesis, there are calls for disciplinary breadth (Stewart et al., 2020; WHO, 2020b). The lessons from our work are highly relevant for anyone convening these forums. Agencies that aim to tackle other global priorities, from climate change to Black Lives Matter, all require cross-field engagement. We hope that in sharing our experiences, we can contribute a little to these engagements.

            An increasing number of organisations and initiatives support the use of evidence in decision making, as well as seeking to address misinformation about claims. Fact-checking organisations, such as Africa Check, Ask for Evidence and iHealthFacts, are on the rise around the world. The potential to apply, reflect and collectively learn from cross-field engagements is considerable, and increasing in importance.

            We know of many other cases in which agreement has been more difficult to achieve. In 2003, the International Campaign to Revitalise Academic Medicine (ICRAM), led by an interdisciplinary working party, accepted the challenge of reinventing academic medicine, but found the task too difficult (ICRAM, 2004). Some within the group suggested that they had not understood the problem (Clark, 2005). Relative to ICRAM, our focus was much more specific, and we began with the common ground of evidence synthesis in supporting decision making.

            It has also been suggested that because disagreement on committees is rife, consensus in health care is reached only on ‘bland generalities that represent the lowest common denominator of debate and are embalmed as truths’ (Buetow et al., 1997: 269). However, we were not a ‘committee’ responsible for reaching a consensus, but had freedom to engage and to choose whether or not the resulting framework was useful for our own fields. We were also encouraged to adapt, and either broaden or narrow, the key concepts for our own fields. There was no pressure from a funder or other authority that required a consensus, or even an output.

            We suspect that respect for one another also encouraged constructive dialogue. We consistently shared concrete examples from our work, and this paper reports just a few of those. We believe that voicing opinion alone would not have been sufficient to reach consensus, and that the examples enabled us to find common ground.

            In conclusion, despite representing broad and varied interests, and belonging to traditionally siloed academic disciplines, we were able to reach agreement on a framework. Through the process of working together, we came to appreciate the breadth of applicability of the evidence-based approach. Readers who find the boundaries of traditional disciplines frustrating should be encouraged by our experience.

            Acknowledgements

            We are grateful to all the participants in this cross-field work, who, in addition to the authors, included: Astrid Austvoll-Dahlgren, Dorothy Bishop, Iain Chalmers, Marie Gaarder, Andy Haines, Carl Heneghan, Robert Matthews, Andrew (Andy) David Oxman, Hazel Roddam, Anel Schoonees, Ray Tallis and Nerys Thomas. Thanks are due to Sally Crowe and Patricia Atkinson for arrangements for the meeting.

            Declarations and conflicts of interest

            Research ethics statement

            The authors conducted the research reported in this article in accordance with the ethical code of the University of Johannesburg where the first author is based.

            Consent for publication statement

            The author declares that research participants’ informed consent to publication of findings – including photos, videos and any personal or identifiable information – was secured prior to publication.

            Conflicts of interest statement

            Ruth Stewart is a member of the Advisory Board of Research for All, in which this article is included. All efforts to sufficiently anonymise the authors during peer review of this article have been made. The authors declare no further conflicts with this article.

            Further reading

Those interested in reading more about approaches to cross-field research might find the following useful.

            On the various perspectives that different groups bring to research, and why one perspective is not enough:

            Crowe, S., Fenton, M., Hall, M., Cowan, K. and Chalmers, I. (2015) ‘Patients’, clinicians’ and the research communities’ priorities for treatment research: There is an important mismatch’. Research Involvement and Engagement, 1, 2. https://doi.org/10.1186/s40900-015-0003-x.

            Morris, S. and Stevenson, O. (2021) ‘The dynamics of working at intersections: Reflections from exploring inequalities’. Research for All, 5 (2), 356–65. https://doi.org/10.14324/RFA.05.2.11.

            On unpacking the involvement of different publics in research:

            Oliver, S., Liabo, K., Stewart, R. and Rees, R. (2015) ‘Public involvement in research: Making sense of the diversity’. Journal of Health Services Research & Policy, 20 (1), 45–51. https://doi.org/10.1177/1355819614551848.

            On the importance of titles and language when working with those outside your immediate professional circle:

            Stewart, R., Dayal, H. and Langer, L. (2017) ‘Terminology and tensions within evidence-informed decision-making in South Africa over a 15-year period’. Research for All, 1 (2), 252–64. https://doi.org/10.18546/RFA.01.2.03.

            On how unusual it has been for cross-field working to take place in some areas of the evidence approach:

            Stewart, R. and Oliver, S. (2006) ‘Reviewing the potential for critical appraisal training to cater for professional practice’. Medical Teacher, 28 (2), e74–e79. https://doi.org/10.1080/01421590600617509.

            On critical thinking in health care and education, and opportunities for collaboration:

            Sharples, J.M., Oxman, A.D., Mahtani, K.R., Chalmers, I., Oliver, S., Collins, K., Austvoll-Dahlgren, A. and Hoffmann, T. (2017) ‘Critical thinking in healthcare and education’. BMJ, 357, j2234. https://doi.org/10.1136/bmj.j2234.

            On the use of tools in interdisciplinary research in drug discovery:

            Haldeman, M., Vieira, B., Winer, F. and Knutsen, L.J. (2005) ‘Exploration tools for drug discovery and beyond: Applying SciFinder® to interdisciplinary research’. Current Drug Discovery Technologies, 2 (2), 69–74. https://doi.org/10.2174/1570163054064693.

            References

1. Aronson JK, Barends E, Boruch R, Brennan M, Chalmers I, Chislett J, Cunliffe-Jones P, Dahlgren A, Gaarder M, Haines A, Heneghan C, Matthews R, Maynard B, Oxman AD, Oxman M, Pullin A, Randall N, Roddam H, Schoonees A, Sharples J, Stott J, Tallis R, Thomas N, Vale L. 2019. Key concepts for making informed choices. Nature. Vol. 572:303–6.

2. Barends E, Janssen B, ten Have W, ten Have S. 2014. Effects of change interventions: What kind of evidence do we really have? Journal of Applied Behavioral Science. Vol. 50(1):5–27.

3. Barnett-Page E, Thomas J. 2009. Methods for the synthesis of qualitative research: A critical review. BMC Medical Research Methodology. Vol. 9(1):59.

4. Buetow SA, Sibbald B, Cantrill JA, Halliwell S. 1997. Appropriateness in health care: Application to prescribing. Social Science and Medicine. Vol. 45(2):261–71.

5. Chalmers I, Oxman AD, Austvoll-Dahlgren A, Ryan-Vig S, Pannell S, Sewankambo N, Semakula D, Nsangi A, Albarqouni L, Glasziou P, Mahtani K, Nunan D, Heneghan C, Badenoch D. 2018. Key concepts for informed health choices: A framework for helping people learn how to assess treatment claims and make informed choices. BMJ Evidence-Based Medicine. Vol. 23(1):29–33.

6. Clark G. 2020. Between science and policy – Scrutinising the role of SAGE in providing scientific advice to government. LSE Blog, 10 September. Accessed 16 November 2020. https://blogs.lse.ac.uk/impactofsocialsciences/2020/09/10/between-science-and-policy-scrutinising-the-role-of-sage-in-providing-scientific-advice-to-government/

7. Clark J; on behalf of ICRAM. 2005. Five futures for academic medicine: The ICRAM scenarios. BMJ. Vol. 331:101.

8. Fretheim A, Flatø M, Steens A, Flottorp SA, Rose CJ, Telle KE, Kinge JM, Schwarze PE. 2020. COVID-19: We need randomised trials of school closures. Journal of Epidemiology & Community Health. Vol. 74:1078–9.

9. Global Research Council. 2016. Statement of principles on interdisciplinarity. Accessed 16 November 2020. https://www.globalresearchcouncil.org/fileadmin/documents/GRC_Publications/Statement_of_Principles_on_Interdisciplinarity.pdf

10. Heinze T, Shapira P, Rogers JD, Senker JM. 2009. Organizational and institutional influences on creativity in scientific research. Research Policy. Vol. 38(4):610–23.

11. ICRAM. 2004. ICRAM (the International Campaign to Revitalise Academic Medicine): Agenda setting. BMJ. Vol. 329:787.

12. Khuroo MS. 2020. Chloroquine and hydroxychloroquine in coronavirus disease 2019 (COVID-19): Facts, fiction and the hype. A critical appraisal. International Journal of Antimicrobial Agents. Vol. 56(3):106101.

13. Oxman M, Habib L, Jamtvedt G, Kalsnes B, Molin M. 2021. Using claims in the media to teach essential concepts for evidence-based healthcare. BMJ Evidence-Based Medicine. Vol. 26:234–6.

14. Paine C, Deeter J, Devlin D, Titus TM, Titus RD, Gibney A, Weiss K. 2006. Who Killed the Electric Car? [film]. Sony Pictures Home Entertainment.

15. Public Health England. 2019. Sugar reduction: Report on progress between 2015 and 2018. London: Public Health England. Accessed 16 November 2020. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/832182/Sugar_reduction__Yr2_progress_report.pdf

16. Research Council of Norway. 2019. Interdisciplinary research: Constructing a level playing field. Policy Brief: Recommendations from the Research Council of Norway’s International Advisory Board, Issue 1, January. Accessed 16 November 2020. https://www.forskningsradet.no/contentassets/5358f3a91d2046818ca271c3f9209cf3/interdisciplinary-research-policy-brief-from-rcn-international-advisory-board-2019.pdf

17. Semakula D, Nsangi A, Oxman AD, Austvoll-Dahlgren A, Oxman M, Rosenbaum S, Morelli A, Glenton C, Lewin S, Kaseje M, Chalmers I, Fretheim A, Ding Y, Sewankambo NK. 2020. Effects of the Informed Health Choices podcast on the ability of parents of primary school children in Uganda to assess the trustworthiness of claims about treatment effects: One-year follow up of a randomised trial. Trials. Vol. 21:187.

18. Sharples JM, Oxman AD, Mahtani KR, Chalmers I, Oliver S, Collins K, Austvoll-Dahlgren A, Hoffmann T. 2017. Critical thinking in healthcare and education. BMJ. Vol. 357:j2234.

19. Stewart R, El-Harakeh A, Cherian SA; on behalf of the LMIC members of COVID-END. 2020. Evidence synthesis communities in low-income and middle-income countries and the COVID-19 response. The Lancet. Vol. 396(10262):1539–41.

20. Warner KE. 1991. Tobacco industry scientific advisors: Serving society or selling cigarettes? American Journal of Public Health. Vol. 81:839–42.

21. Wellcome Trust. 2017. Our goals for the year ahead. Online news, 1 March. Accessed 16 November 2020. https://wellcome.org/news/our-goals-year-ahead

22. Wilkins T, Cooper I. 2019. Lessons from coordinating a knowledge-exchange network for connecting research, policy and practice. Research for All. Vol. 3(2):204–17.

23. World Health Organization. 2020a. Novel Coronavirus (2019-nCoV): Situation report – 13. Geneva: WHO. Accessed 16 November 2020. https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf

24. World Health Organization. 2020b. A Coordinated Global Research Roadmap: 2019 novel coronavirus. Geneva: WHO R&D Blueprint. Accessed 16 November 2020. https://www.who.int/publications/m/item/a-coordinated-global-research-roadmap

25. Zhao Y. 2017. What works may hurt: Side effects in education. Journal of Educational Change. Vol. 18:1–19.

            Author and article information

Journal: Research for All, Volume 6, Issue 1, e06105. UCL Press (UK). ISSN 2399-8121. Published 08 February 2022.
            Affiliations
[1] Africa Centre for Evidence, University of Johannesburg, South Africa
[2] Centre for Evidence-Based Medicine, Nuffield Department of Primary Care Health Sciences, University of Oxford, UK
[3] Centre for Evidence-Based Management (CEBMa), Amsterdam, Netherlands
[4] Graduate School of Education and Statistics Department, University of Pennsylvania, USA
[5] Centre for Evidence-based Veterinary Medicine, School of Veterinary Medicine and Science, University of Nottingham, UK
[6] Kritikos, Reading, UK
[7] Westminster University, UK
[8] School of Social Work, Saint Louis University, USA
[9] Centre for Epidemic Interventions Research, Norwegian Institute of Public Health, Norway
[10] Centre for Evidence-Based Conservation, School of Natural Sciences, Bangor University, UK
[11] Centre for Evidence-Based Agriculture, Harper Adams University, UK
[12] Education Endowment Foundation, UK
[13] Museum of Natural History, University of Oxford, UK
[14] Institute of Health and Society, Newcastle University, UK
            Author notes
*Correspondence: ruths@uj.ac.za
            Author information
            https://orcid.org/0000-0002-9891-9028
            https://orcid.org/0000-0003-1139-655X
            https://orcid.org/0000-0002-6093-3959
            https://orcid.org/0000-0002-5318-5403
            https://orcid.org/0000-0002-4893-6583
            https://orcid.org/0000-0003-3023-7386
            https://orcid.org/0000-0002-9356-7318
            https://orcid.org/0000-0002-6323-9620
            https://orcid.org/0000-0001-5299-8042
            https://orcid.org/0000-0002-3023-8488
Article DOI: 10.14324/RFA.06.1.05
            Copyright 2022, Ruth Stewart, Jeffrey K. Aronson, Eric Barends, Robert Boruch, Marnie Brennan, Joe Chislett, Peter Cunliffe-Jones, Brandy Maynard, Matt Oxman, Andrew Pullin, Nicola Randall, Jonathan Sharples, Janet Stott and Luke Vale

            This is an open-access article distributed under the terms of the Creative Commons Attribution Licence (CC BY) 4.0 https://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.

History: 18 November 2020; 24 November 2021
Page count: Tables: 4, References: 25, Pages: 12
Categories: Article
Subjects: Assessment, Evaluation & Research methods; Education & Public policy; Educational research & Statistics
Keywords: cross-field; informed choices; consensus; evidence; interdisciplinary; transdisciplinary
