
Using narrative reviews, systematic reviews, and meta-analyses in evidence-based knowledge management (KM)

This article is part of an ongoing series of articles on evidence-based knowledge management.

As I’ve previously advised, evidence-based knowledge management (KM) involves four sources of evidence: the scientific literature, evidence from the organisation, evidence from practitioners, and evidence from stakeholders.

Systematic and narrative reviews

The scientific literature evidence source should include systematic reviews1 because they “can help address managerial problems by producing a reliable knowledge base through accumulating findings from a range of studies.” Because of this, I use systematic reviews as the basis of my RealKM Magazine articles wherever possible.

But does this mean that the other key approach to conducting a research review – the narrative review – is inappropriate as an evidence source? Systematic reviews are generally placed above narrative reviews in the research evidence hierarchy, but a recent paper2 argues “that systematic reviews and narrative reviews serve different purposes and should be viewed as complementary.”

The paper gives the following definitions of narrative and systematic reviews:

  • A narrative review is a scholarly summary along with interpretation and critique. It can be conducted using a number of distinctive methodologies.
  • The defining characteristic of a systematic review in the Cochrane sense is the use of a predetermined structured method to search, screen, select, appraise and summarise study findings to answer a narrowly focused research question.

It then provides the following advice in regard to each type of review:

  • Narrative reviews provide interpretation and critique; their key contribution is deepening understanding.
  • Conventional systematic reviews address narrowly focused questions; their key contribution is summarising data.

So, in your evidence-based practice, you should use systematic reviews to answer specific questions about the suitability or applicability of a particular process, tool, or technique to a particular use or situation, and narrative reviews to deepen your overall understanding.

As well as using systematic reviews as the basis of many of my articles, I also draw on narrative reviews, and I research and publish narrative reviews of my own in the form of feature articles.

Meta-analyses

A common misconception is that systematic reviews and meta-analyses are the same thing, and the terms are often used interchangeably. However, as a recent article on the Campbell Collaboration blog advises, they are different but related activities:

  • A systematic review is a detailed, systematic and transparent means of gathering, appraising and synthesising evidence to answer a well-defined question.
  • A meta-analysis is a statistical procedure for combining numerical data from multiple separate studies. A meta-analysis should only ever be conducted in the context of a systematic review.

The article further advises readers to “Beware of meta-analyses that do not follow a systematic and transparent process for identifying and selecting which studies to include in analysis.”
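To make the meta-analysis idea concrete, below is a minimal sketch (in Python) of the core calculation involved: pooling the effect sizes reported by several studies using inverse-variance weights, so that more precise studies count for more. The study names and numbers are hypothetical, and a real meta-analysis would be conducted within a systematic review and would typically also examine heterogeneity (for example, with a random-effects model).

```python
import math

# Hypothetical studies: (name, effect size, standard error).
# In practice these would come from studies identified and appraised
# through a systematic review's transparent search and selection process.
studies = [
    ("Study A", 0.42, 0.15),
    ("Study B", 0.31, 0.10),
    ("Study C", 0.55, 0.20),
]

# Inverse-variance weights: studies with smaller standard errors get more weight.
weights = [1 / se ** 2 for _, _, se in studies]

# Fixed-effect pooled estimate: the weighted average of the individual effects.
pooled_effect = sum(w * eff for (_, eff, _), w in zip(studies, weights)) / sum(weights)

# Standard error of the pooled estimate and an approximate 95% confidence interval.
pooled_se = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled_effect - 1.96 * pooled_se, pooled_effect + 1.96 * pooled_se

print(f"Pooled effect: {pooled_effect:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```

The sketch also shows why the caution above matters: the calculation will pool whatever studies it is given, so the credibility of the result depends on how systematically and transparently those studies were identified and selected.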

A recent paper3 further cautions that:

Published research … can only be useful for theory or applications if it is credible. In science, a credible finding or hypothesis is one that has repeatedly survived high-quality, risky attempts at proving it wrong. The more such falsification attempts a finding survives, and the riskier those attempts are, the more credible a finding can be considered.

The currently dominant strategy to assess the credibility of an effect involves meta-analyzing all known studies on a topic. Such state-of-the-art meta-analytic approaches, however, cannot determine the true credibility of an effect because they do not account for the extent to which each included study has survived risky falsification attempts.

The paper proposes a unified framework to address this issue.

For knowledge management practitioners, the important messages are that it is the systematic review, rather than the associated meta-analysis, that is most important, and that meta-analyses have potential shortcomings, for example not accounting for how well each included study has survived risky attempts at falsification.

Summary of advice for knowledge management (KM) practitioners

In their evidence-based practice, knowledge managers should use systematic reviews to answer specific questions about the suitability or applicability of a particular process, tool, or technique to a particular use or situation, and narrative reviews to deepen their overall understanding.

A narrative review is a scholarly summary along with interpretation and critique. It can be conducted using a number of distinctive methodologies.

A systematic review uses a predetermined structured method to search, screen, select, appraise and summarise study findings to answer a narrowly focused research question.

A meta-analysis is a statistical procedure for combining numerical data from multiple separate studies, and should only ever be conducted in the context of a systematic review. For knowledge management practitioners, it is the systematic review, rather than the associated meta-analysis, that is most important.

Header image source: Adapted from Evidence Based by Nick Youngson on Alpha Stock Images which is licensed under CC BY-SA 3.0.

References:

  1. Briner, R. B., & Denyer, D. (2012). Systematic review and evidence synthesis as a practice and scholarship tool. Handbook of evidence-based management: Companies, classrooms and research, 112-129.
  2. Greenhalgh, T., Thorne, S., & Malterud, K. (2018). Time to challenge the spurious hierarchy of systematic over narrative reviews? European Journal of Clinical Investigation, e12931.
  3. LeBel, E. P., McCarthy, R., Earp, B. D., Elson, M., & Vanpaemel, W. (2018). A unified framework to quantify the credibility of scientific findings. PsyArXiv.

Bruce Boyes

Bruce Boyes is a knowledge management (KM), environmental management, and education thought leader with more than 40 years of experience. As editor and lead writer of the award-winning RealKM Magazine, he has personally written more than 500 articles and published more than 2,000 articles overall, resulting in more than 2 million reader views. With a demonstrated ability to identify and implement innovative solutions to social and ecological complexity, Bruce has successfully completed more than 40 programs, projects, and initiatives including leading complex major programs. His many other career highlights include: leading the KM community KM and Sustainable Development Goals (SDGs) initiative, using agile approaches to oversee the on time and under budget implementation of an award-winning $77.4 million recovery program for one of Australia's most iconic river systems, leading a knowledge strategy process for Australia’s 56 natural resource management (NRM) regional organisations, pioneering collaborative learning and governance approaches to empower communities to sustainably manage landscapes and catchments in the face of complexity, being one of the first to join a new landmark aviation complexity initiative, initiating and teaching two new knowledge management subjects at Shanxi University in China, and writing numerous notable environmental strategies, reports, and other works. Bruce is currently a PhD candidate in the Knowledge, Technology and Innovation Group at Wageningen University and Research, and holds a Master of Environmental Management with Distinction and a Certificate of Technology (Electronics). As well as his work for RealKM Magazine, Bruce currently also teaches in the Beijing Foreign Studies University (BFSU) Certified High-school Pathway (CHP) program in Baotou, Inner Mongolia, China.
