
Some things Professor Rob Briner thinks he got wrong in trying to promote evidence-based practice

This article is part of an ongoing series on evidence-based knowledge management.

Professor Rob Briner is Co-founder & Scientific Director of the Center for Evidence-Based Management (CEBMa). As he is considered a leading global authority on evidence-based practice, his perspectives and publications are important components of RealKM Magazine's guidance and resources for evidence-based knowledge management (KM).

In a LinkedIn post, Professor Briner shares a list of some things he thinks he got wrong in trying, over the past 25 years, to promote the idea of evidence-based practice (EBP) in industrial-organizational (IO) psychology, human resources (HR), and management.

These things are:

  1. Focusing way too much on ‘the science’ (systematic reviews, pushing research) – important but it’s just one source of evidence and awareness doesn’t seem to change behaviour.
  2. Not acknowledging that universities and academics are, in general, part of the problem of low adoption, not part of the solution.
  3. Framing it as part of the (tedious and misplaced) practitioner/academic divide/gap debate.
  4. Implying that scientists are somehow better or purer practitioners compared to managers when they are not (e.g., questionable research practices).
  5. Making EBP sound like a technocratic solution which can only be undertaken by experts, nerds, wonks, brainiacs, elites, etc.
  6. Being too myth-bustery – tends to alienate rather than engage – crucial to challenge dodgy stuff but needs to be much more sophisticated.
  7. Not sufficiently appreciating how the work context pulls practitioners away from EBP. In other words, not appreciating managers’ constraints and incentives.
  8. Implying practitioners are making mistakes or are silly or odd or not thinking straight.
  9. Sounding (and being) smug and preachy – “you really should do this, it’s good for you.”
  10. Not being sufficiently clear about when an EBP approach makes more or less sense (e.g., based on problem importance).
  11. Not engaging effectively with reasonable objections to EBP (e.g., that it’s time-consuming).
  12. Insufficient focus on the ‘diagnosis’ part of EBP and too much focus on using EBP to find a solution (the “what works?” approach only works with clearly understood/diagnosed problems).
  13. Positioning EBP as something individuals or teams can do when it requires structural and systemic thinking and action.
  14. Not emphasizing that the quality of decision-making is about the process, not the outcome (a bit like saying an experiment was good because it got the result the researchers wanted).
  15. Failing to clarify EBP is not a one-off thing but needs to be part of a longer-term process of individual, group and organizational learning and development.
  16. Not starting from where practitioners are now and what they can realistically do now.
  17. Positioning EBP as about making really very well-informed decisions rather than better-informed decisions.
  18. Focusing on the teeny-tiny incy-wincy proportion of practitioners who want and are able to try EBP – sure it’s not for everyone but it’s also not for almost no-one.

Header image source: Adapted from Evidence Based by Nick Youngson on Alpha Stock Images, licensed under CC BY-SA 3.0.
