Dr. Russell Ackoff’s list of antisystemic (sub-optimising) methods – part 4/8 – dissecting Benchmarking

This is part 4 of the series of blog posts elaborating on some of the methods in the list of managers’ panaceas [1], a list of anticipated antisystemic (sub-optimising) methods that Dr. Russell Ackoff presented in the mid-1990s. Today’s blog post will dissect Benchmarking.

Specifically, regarding Benchmarking, the national surveys from the beginning of the 1990s that Dr. Ackoff referred to showed, in his own words [1]: “Benchmarking. More than 50% have resulted in an increasing cost”.

A company does Benchmarking to try to gain a better understanding of its sales, margin, labour productivity, expense management, efficiency, quality rating, unit costs, etc.
Measuring, comparing with the competition, and identifying opportunities for improvement are the essence of Benchmarking [2], but Benchmarking does not provide the reasons behind the findings, the why.

Here is an example list of different types of Benchmarking [3]:

  • Process Benchmarking
  • Financial Benchmarking
  • Benchmarking from an investor perspective
  • Performance or Competitive Benchmarking
  • Product Benchmarking
  • Strategic Benchmarking
  • Functional Benchmarking
  • Best-in-class Benchmarking

Many definitions of Benchmarking are available; here is one example [2]:

Benchmarking is the systematic process of measuring one’s performance against recognized leaders for the purpose of determining best practices that lead to superior performance when adapted and utilized.

(Construction Industry Institute, CII, 1995)

Benchmarking is often part of a change programme, for example Continuous Improvement cycles, and can therefore be seen as two different, somewhat intertwined steps: the first step is the benchmark itself and the second step is the actual change.
In the first step a measurement is made, either internally between different functions, silos, etc. within the company, or as a comparison with external companies on some measure. This reveals a gap between current performance and wanted performance, where the latter is often best practice, best in class, or an industry standard.
The second step is to close this gap, i.e. make the actual change to the organisation, which is often done with the PDCA cycle or DMAIC (from Six Sigma), but sometimes also by trying to change people.
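The first (measurement) step above can be sketched as a simple gap analysis. This is only a minimal illustration, assuming hypothetical metric names and best-in-class target values; it is not taken from any particular benchmarking tool:

```python
# Minimal sketch of the benchmark (measurement) step: compare a company's
# current metrics against best-in-class targets and report the gaps.
# All metric names and numbers below are hypothetical, for illustration only.

def benchmark_gaps(current: dict, best_in_class: dict) -> dict:
    """Return the gap (target minus current) for each metric present in both dicts."""
    return {
        metric: best_in_class[metric] - current[metric]
        for metric in current
        if metric in best_in_class
    }

current = {"labour_productivity": 82.0, "quality_rating": 91.5, "unit_cost": 13.4}
targets = {"labour_productivity": 95.0, "quality_rating": 98.0, "unit_cost": 10.4}

# Note: for a cost metric a negative gap means the target is lower than the
# current value; whether "closing the gap" means increasing or decreasing a
# number depends on the metric.
for metric, gap in benchmark_gaps(current, targets).items():
    print(f"{metric}: gap = {gap:+.1f}")
```

The point of the post, of course, is that this step tells you *that* a gap exists, not *why* it exists, and closing it part by part is exactly the sub-optimisation being criticised.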

But remember that all organisations are complex, which means that any change made to the parts of the organisation, for example closing gaps as in this case, will also change some of the processes or practices regarding the way of working. This in turn changes the interactions within the organisation, resulting in unintended consequences that cannot be foreseen, which is the same as acting anti-systemically, or sub-optimising.

Dr. Ackoff put it very clearly when he thought that people were denying the obvious [4]: “Improving the performance of the parts of a system taken separately will necessarily improve the performance of the whole. False. In fact, it can destroy an organization, as is apparent in an example I have used ad nauseum: Installing a Rolls Royce engine in a Hyundai can make it inoperable. This explains why benchmarking has almost always failed.”

Or as Dave Snowden puts it regarding systems dynamics, of which Benchmarking is a part, but from a complexity theory perspective: “For some years now I been distinguishing systems thinking from complexity thinking in a variety of ways. Technically I suppose I mean systems dynamics, but the two are largely conflated these days… I am not hostile to systems thinking, but I do think its basic assumptions make its various tools and approaches suitable for complicated but not complex thinking. For me the main difference is that most systems thinking approaches, in particular those which are shall we say “popular” focus on defining an ideal future state, then seek to close the gap.” [5]. He continues: “You can’t manage to a desired future state but have to manage the evolutionary potential of the situated present. You can’t predict the future, but you can increase resilience in the here and now which will allow you to manage that uncertainty.” [6], and finally: “Balanced score card and benchmarking …The same is true for mission statements – all products of systems dynamics” [7].

Since we cannot optimise the whole by optimising the parts, only Benchmarking done on the whole company in comparison with another company can have value. But then, of course, the only way to close that gap would be to act systemically and solve the root causes of the problems, which would be the same regardless of whether Benchmarking had been done or not.

Since Toyota acts systemically and has long been at the forefront of success, it certainly does not need to benchmark.

Conclusions:
Benchmarking many times first excludes a part or layer from the system and measures it, and then tries to close the gap between current performance and wanted performance, in most cases by changing another part of the system, such as the processes. From a complexity theory perspective this is a double fault, since both actions are done on parts of the system, which clearly shows how easy it is to fail when not acting systemically.

In Our Prefilled Root Cause Analysis Map – for organisations, ver. 0_99 for the normal silo organisation, the implementation after the benchmark step can look like this, with some examples of sub-optimising areas: 1) Improvement cycles with PDCA and DMAIC techniques will change the processes of the organisation, in the same area where a silo organisation normally has one of its pain points. 2) Changes of people can concern culture, values, mindset, behaviour, etc., which emerge over time and are very far from the root causes.

Fairly easy judgement again.

Dr. Ackoff 3 – 0 Panaceas

In the next blog post we are going to dissect the next method in the list of managers’ panaceas: Process Reengineering, or Business Process Reengineering, BPR, which is the most common name today. C u then.

 

References:
[1] Ackoff, Dr. Russell Lincoln. Speech. “Systems-Based Improvement, Pt 1.”, Lecture given at the College of Business Administration at the University of Cincinnati on May 2, 1995.
The list at 03:30 min and the national surveys about Benchmarking at 04:32. Link copied 2018-10-27.
https://www.youtube.com/watch?v=_pcuzRq-rDU

[2] The National Academies of Sciences, Engineering, and Medicine.
Link copied 2019-01-05.
https://www.nap.edu/read/11344/chapter/5#22

[3] Wikipedia. Benchmarking. Link copied 2018-12-21.
https://en.wikipedia.org/wiki/Benchmarking

[4] Ackoff, Russell Lincoln. Article. Link copied 2018-12-15.
https://thesystemsthinker.com/a-lifetime-of-systems-thinking/

[5] Snowden, Dave. Blog post. Link copied 2019-01-10.
https://cognitive-edge.com/blog/babies-should-not-be-thrown-out-with-bathwater/

[6] Snowden, Dave. Blog post. Link copied 2019-01-10.
https://cognitive-edge.com/blog/systems-thinking-complexity/

[7] Snowden, Dave. Blog post. Link copied 2018-12-22.
https://cognitive-edge.com/blog/the-mad-sculptor/