Potential alternative approaches to evaluating knowledge management (KM) program performance

A question often raised in knowledge management (KM) forums is how to show return on investment in KM. Most recently, a member of the SIKM Leaders Community asked:

The question which is always asked on KM performance is, what is the benefit. Change in culture, improvement in collaborating, creation of assets, etc. is not an answer to this. since this is just an output but not measure of benefit/success. Can someone share, how we can correlate KM with business benefits OR how can we show return on investment in KM. Thank you.

Demonstrating return on investment has also been an issue for the natural resource management (NRM) sector, where I’ve worked for much of my career. Indeed, a 2008 performance audit report from the Australian National Audit Office (ANAO) examining the regional delivery of nearly $3 billion in funds through the Natural Heritage Trust (NHT) states that:1

Performance monitoring, evaluation and reporting are essential for determining the extent to which agencies’ outputs and administered items contribute to the achievement of the program outcomes. ANAO audits in 1996–97, 2000–01 and again in 2004–05 found weaknesses in the monitoring and reporting of the performance of the NHT. In summary there was no effective outcomes reporting.

Echoing the question asked in the SIKM Leaders Community above, Australia’s state and territory agencies and 56 regional NRM bodies could clearly demonstrate the outputs of the investment – that is, activities undertaken such as tree planting, environmental weed removal, and training events – but far less clearly the outcomes achieved through those activities.

In response, the NRM sector sought advice from the evaluation community and, as stated in the 2008 ANAO report, trialled and then implemented program logic and performance story reporting. By that time, the NRM sector had already sought advice from the KM community as well, establishing the Knowledge for Regional NRM Program, which I worked on from 2006 to 2008. The program’s activities included support for program logic and performance story reporting.

Program logic and performance story reporting are introduced below, together with two other approaches that are also increasingly being used in the NRM sector and beyond – most significant change (MSC) and multiple lines and levels of evidence (MLLE).

Just as the NRM sector has benefitted from the knowledge of the evaluation and KM communities, the KM community can potentially benefit from the knowledge of the evaluation and NRM communities. The numerous examples of research and case studies from the NRM sector in the taking responsibility for complexity series already demonstrate how the KM community can learn from the NRM sector’s extensive experience and expertise in managing in the face of complexity.

Program logic and performance story reports

As well as being used in the NRM sector, program logic and performance story reports are used widely in evaluating the performance of programs in both government and the private sector.

A program logic model sets out the resources and activities that make up a program, and the outputs and outcomes expected to result from them. The logic model is a tool for both program planning and evaluation because it draws out the relationships between resources, activities, and outcomes. It is also very useful for communication and engagement, including communicating program successes to senior management.
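As a minimal illustrative sketch (not drawn from the guides cited in this article, and with hypothetical example values), the chain that a logic model asserts – resources flowing through activities to outputs and then outcomes – can be represented as a simple data structure:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Minimal program logic model: inputs flow through activities
    to outputs (what was done) and outcomes (what changed)."""
    inputs: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)

    def chain(self) -> str:
        """Return the if-then chain the logic model asserts."""
        stages = [("inputs", self.inputs), ("activities", self.activities),
                  ("outputs", self.outputs), ("outcomes", self.outcomes)]
        return " -> ".join(f"{name}: {', '.join(items)}" for name, items in stages)

# Hypothetical example loosely based on the NRM activities mentioned above
model = LogicModel(
    inputs=["NHT regional funding"],
    activities=["tree planting", "weed removal", "training events"],
    outputs=["hectares revegetated", "people trained"],
    outcomes=["improved riparian habitat condition"],
)
print(model.chain())
```

Making each link in the chain explicit like this is what lets an evaluator test whether outputs actually led to outcomes, rather than reporting outputs alone.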

The following resources provide comprehensive guidance on how to develop a program logic and then, based on that logic, tell a performance story for a program:

  • Exploring Program Logic – video from the New South Wales (NSW) Ministry of Health

Most significant change (MSC)

The most significant change (MSC) technique is a form of participatory monitoring and evaluation. It is participatory because many project stakeholders are involved both in deciding the sorts of change to be recorded and in analysing the data, and it contributes to evaluation because it provides data on impact and outcomes that can be used to help assess the performance of the program as a whole.
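To make the participatory selection step concrete, here is a purely illustrative sketch (the story texts, IDs, and single voting round are hypothetical, not taken from the MSC guides): stakeholders nominate change stories, and a review panel votes for the one it considers most significant.

```python
from collections import Counter

# Hypothetical change stories nominated by project stakeholders
stories = {
    "S1": "Farmers now share monitoring data across the catchment",
    "S2": "A community group took over weed control planning",
    "S3": "Agency staff reuse lessons from earlier projects",
}

# Each panel member votes for the story they judge most significant
panel_votes = ["S2", "S1", "S2", "S3", "S2"]

tally = Counter(panel_votes)
most_significant, votes = tally.most_common(1)[0]
print(f"Selected: {most_significant} ({votes} votes) - {stories[most_significant]}")
```

In practice MSC also asks panels to record *why* a story was selected, so the selection criteria themselves become evaluation data.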

The following resources introduce the MSC technique and provide guidance on its use:

Multiple lines and levels of evidence (MLLE)

Establishing cause and effect relationships is challenging when seeking to evaluate the performance of complex interventions. As I’ve discussed in a previous RealKM Magazine article, the multiple lines and levels of evidence (MLLE) approach is very useful in this regard. It involves bringing together different types of evidence, and systematically analysing the strength of the causal argument linking an intervention or a cause to its effects.
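The core idea of weighing multiple lines of evidence can be sketched in a few lines of code. This is illustrative only – the evidence lines, strength scores, and simple signed tally below are hypothetical, and the causal criteria actually used in MLLE are defined in the manuals cited in the references:

```python
# Each line of evidence: (description, supports the causal link?, strength 1-3)
EVIDENCE = [
    ("before/after monitoring data", True, 3),
    ("spatial comparison with reference sites", True, 2),
    ("expert opinion", True, 1),
    ("published literature on the mechanism", False, 2),
]

def weigh(evidence):
    """Sum signed strengths: positive values support the causal
    argument, negative values count against it."""
    return sum(s if supports else -s for _, supports, s in evidence)

score = weigh(EVIDENCE)
verdict = "supported" if score > 0 else "not supported"
print(f"net weight of evidence: {score} ({verdict})")
```

The point of the sketch is that no single line of evidence decides the question; the causal argument rests on the combined weight, with stronger lines counting for more.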

The following resources introduce the MLLE approach and provide guidance on its use:

  • Learning from the evidence about evidence-based policy (see page 204)10 – discussion of the MLLE approach in a presentation given as part of the Australian Productivity Commission’s 2009 roundtable on the topic of “Strengthening Evidence-Based Policy in the Australian Federation”
  • Evaluation Design – brief guide from Environment NSW on how to use the MLLE approach

References:

  1. Australian National Audit Office (2008). Regional Delivery Model for the Natural Heritage Trust and the National Action Plan for Salinity and Water Quality: Department of the Environment, Water, Heritage and the Arts; Department of Agriculture, Fisheries and Forestry. Auditor-General Report No. 21 of 2007–08.
  2. McLaughlin, J. A., & Jordan, G. B. (1999). Logic models: a tool for telling your program’s performance story. Evaluation and Program Planning, 22(1), 65-72.
  3. Centre for Epidemiology and Evidence. Developing and Using Program Logic: A Guide. Evidence and Evaluation Guidance Series, Population and Public Health Division. Sydney: NSW Ministry of Health, 2017.
  4. Roughley, A. (2009). Developing and Using Program Logic in Natural Resource Management, User Guide. Australian Government Land and Coasts (AGLC).
  5. Roughley, A. & Dart, J. (2009). Developing a Performance Story Report, User Guide. Australian Government Land and Coasts (AGLC) & Clear Horizon.
  6. Bond, A., Morgan, A., & O’Connor, P. (2009). Outback to the Ocean Performance Story Report. O’Connor NRM Pty Ltd.
  7. Clear Horizon (2008). Mount Lofty Ranges Southern Emu-wren and Fleurieu Peninsula Swamps Recovery Program Performance Story Report.
  8. Serrat, O. (2009). The Most Significant Change Technique. Knowledge Solutions, 25. Asian Development Bank (ADB).
  9. Davies, R. & Dart, J. (2005). The ‘Most Significant Change’ (MSC) Technique: A Guide to its Use.
  10. Rogers, P. J. (2010). Learning from the evidence about evidence-based policy. In Strengthening Evidence Based Policy in the Australian Federation, Volume 1: Proceedings. Roundtable Proceedings, Productivity Commission, Canberra.
  11. Norris, R., Nichols, S., Ransom, G., Liston, P., Barlow, A., and Mugodo, J. (2008). Causal Criteria Methods Manual: Methods for applying the multiple lines and levels of evidence (MLLE) approach for addressing questions of causality. eWater Cooperative Research Centre & Institute for Applied Ecology, School of Resource Environment & Heritage Science, University of Canberra.
  12. Food and Agriculture Organization of the United Nations (2016). Final evaluation of the Improved Global Governance for Hunger Reduction Programme – Annexes. April 2016.
  13. Barnes, J. and Bishop, J. (2016). Review of UN SWAP Evaluation Performance Indicator Reporting. Prepared by ImpactReady LLP for the Review Management Group, UNEG Working Group on Gender Equality and Human Rights.
  14. Boyes, B. (2006). Determining and managing environmental flows for the Shoalhaven River, Report 1 – Environmental Flows Knowledge Review. NSW Department of Natural Resources, May 2006.
  15. Ogden, R., Davies, P., Rennie, B., Mugodo, J. and Cottingham, P. (2004). Review of the 1999 ACT Environmental Flow Guidelines. A report by the CRCFE to Environment ACT. November 2004.

Also published on Medium.

Bruce Boyes

Bruce Boyes (www.bruceboyes.info) is editor, lead writer, and a director of the award-winning RealKM Magazine (www.realkm.com) and currently also teaches in the University of NSW (UNSW) Foundation Studies program in China. He has expertise and experience in a wide range of areas including knowledge management (KM), environmental management, program and project management, writing and editing, stakeholder engagement, communications, and research. Bruce holds a Master of Environmental Management with Distinction and a Certificate of Technology (Electronics). With a demonstrated ability to identify and implement innovative solutions to social and ecological complexity, Bruce's many career highlights include establishing RealKM Magazine as an award-winning resource for knowledge managers, using agile and knowledge management approaches to oversee the implementation of an award-winning $77.4 million river recovery program in western Sydney on time and under budget, leading a knowledge strategy process for Australia's 56 natural resource management (NRM) regional organisations, pioneering collaborative learning and governance approaches to support communities to sustainably manage landscapes and catchments, and initiating and teaching two new knowledge management subjects at Shanxi University in China.
