From Davis Balestracci – "Which of Dr. Deming's 14 points should I start with?" Answer: NONE of them!

Published: Mon, 10/24/16

From Davis Balestracci – "Which of Dr. Deming's 14 points should I start with?"

Answer:  NONE of them...
Your style of writing is consistent with the elements of quality that you are teaching: relevant, focused, simple. I'm glad to see I am not the only one worn out by "belts" and "tools."  -- LinkedIn post feedback

[Take 5 to 6 minutes to read over a break or lunch]
 

Have you ever heard something similar?

 
"I'm committed to Dr. Deming's approach [or Six Sigma or Lean or TPS (it doesn't matter)], but executives don't seem to listen any more.  All they do is keep interrupting my very clear explanations with, 'Show me some results, then show me what to do.' I was shocked that my red bead experiment demonstration neither awed nor convinced themseveral of them even walked out during it! Help me. Which of Dr. Deming's 14 Points should I start with to get their attention and results they want?" 

My answer would indeed be:  NONE of them...and ALL of them!


Hi, Folks,
If anyone either continues to ask this last question or is confused by my answer, please read this, then heed the following advice from Dr. Deming himself.  Why? Because you don't quite get Dr. Deming's message.

From The Best of Deming, by Ron McCoy:

  • "We are being ruined by best efforts."

  • "Judging people does not help them."

  • "If you stay in this world, you will never learn another one."

  • "Does experience help? NO! Not if we are doing the wrong things."

  • "There is nothing more costly than a hack."
 

Data INsanity:  "Off we go to the Milky Way!"...yet again!

 
How many of you have to endure quarterly review meetings – the dreaded "account for" results versus goals (usually arbitrary numerical ones) – and the ensuing "What are you going to do about [insert specific negative variance here]?"  Let me suggest how you can easily amuse yourself during these dreadful meetings, then earn a ton of respect and credibility from your colleagues afterward.

Missed appointments for physical therapy at a medical center were an ongoing costly problem.  The national standard was 20 percent, and the department supervisor had been asked at the end of the previous year to stretch and set a "tough" goal.  She decided on half of the national standard – 10 percent.

Here are some actual data presented at a year-end review, which, as you can see, are afflicted with the current toxic plague of "red…yellow…green" stoplight data presentation.


On the surface, the year's performance was not too bad:  nine greens, one yellow, and two reds. But someone astutely observed, "Both reds were in the second half of the year though. After July's red, there was a nice trend down. Good work! But the trend of the last couple of months and a red December are not good signs. Can't you do what you did in August, September, and October again?"

This red December performance cast a real pall on things. She was grilled about the disturbing trend and why it was so high so late in the year. Doubt began to creep in about whether her improvement efforts were effective – they had obviously slipped.

This was reinforced when the indicator's overall yearly average performance of 10 percent was "up" when compared to the previous year's performance of 9.4 percent (two boxes in lower left corner). Even more pointed questions resulted, and there was discussion about making the goal even "tougher" for next year.  

Déjà vu?
 

The alternative?  Simpler than you might think

 
The data are right in front of you in the bottom row of the table.  It takes only a few minutes to sketch a run chart (a time-ordered plot with the data's median drawn in as a reference line):
 
Even though the data are limited, nothing looks amiss. There are no trends – five or six successive increases or decreases – and no run of eight consecutive points all above or all below the median. Such a run would indicate a shift, e.g., possible improvement if (1) it occurs early in the data and lies above the median, or (2) it occurs late in the data and lies below the median.
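If you want to automate those two checks, here is a minimal sketch in Python. The monthly percentages below are hypothetical placeholders (not the actual table values), and the trend/shift thresholds simply follow the conventions just described:

# Run-chart checks: (1) a trend of successive increases or decreases,
# (2) a run of 8 consecutive points all above or all below the median.
from statistics import median

# Hypothetical monthly no-show percentages -- placeholders, not the real data
pcts = [9.8, 11.2, 8.5, 10.4, 9.1, 12.0, 13.1, 9.7, 8.9, 8.2, 10.6, 12.4]

med = median(pcts)

def longest_trend(data):
    """Longest streak of successive increases or decreases."""
    best = streak = direction = 0
    for prev, curr in zip(data, data[1:]):
        d = (curr > prev) - (curr < prev)   # +1 = increase, -1 = decrease, 0 = no change
        if d != 0 and d == direction:
            streak += 1
        else:
            streak = 1 if d != 0 else 0
        direction = d
        best = max(best, streak)
    return best

def longest_run_one_side(data, med):
    """Longest run of consecutive points all above or all below the median."""
    best = run = side = 0
    for x in data:
        s = (x > med) - (x < med)           # +1 above, -1 below, 0 on the median
        if s == 0:
            continue                        # points on the median neither extend nor break a run
        run = run + 1 if s == side else 1
        side = s
        best = max(best, run)
    return best

print(f"median = {med:.1f}")
print("trend signal (six or more successive moves)?", longest_trend(pcts) >= 6)
print("shift signal (eight or more points on one side)?", longest_run_one_side(pcts, med) >= 8)

Neither signal fires for these placeholder numbers – which is exactly the situation in the chart above: nothing looks amiss.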

In another five minutes, you could easily come up with the following process behavior chart (Individuals chart) [Note to any nitpickers: I'm intentionally not using a p-chart]:
 
The process has been stable the entire year, i.e., common cause only: (1) all data points are within the limits, and (2) no special cause tests are triggered. Currently, any one month's performance will randomly fluctuate between 5 and 15 percent – a range that encompasses the alleged special cause cutoffs of all three traffic-light colors. Additionally, any one month can differ from its immediate predecessor by as much as 6.3 percent.

In other words, each data point is merely statistical variation on a process perfectly designed to produce, consistently, 10 percent cancellations/no shows.

The result of all her hard effort has been...? 

Oh, and the math required to create this chart? Basic multiplication and addition, plus the abilities to (1) count to 8; (2) subtract two numbers; and (3) sort a list of numbers from lowest to highest.
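For anyone who wants to see that arithmetic spelled out, here is a minimal sketch in Python using the conventional Individuals (XmR) chart constants of 2.66 (for the natural process limits) and 3.268 (for the largest month-to-month difference); the monthly percentages are again hypothetical placeholders, not the actual data, and the exact arithmetic behind the chart above may differ slightly in detail:

# Individuals (XmR) chart limits straight from the data -- no belt required.
# Hypothetical monthly no-show percentages -- placeholders, not the real data
pcts = [9.8, 11.2, 8.5, 10.4, 9.1, 12.0, 13.1, 9.7, 8.9, 8.2, 10.6, 12.4]

mean = sum(pcts) / len(pcts)

# Moving ranges: each month's absolute difference from its predecessor
moving_ranges = [abs(curr - prev) for prev, curr in zip(pcts, pcts[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Conventional XmR constants: 2.66 for the individuals limits,
# 3.268 for the upper limit on month-to-month differences
lower_limit = mean - 2.66 * mr_bar
upper_limit = mean + 2.66 * mr_bar
upper_range_limit = 3.268 * mr_bar

print(f"average performance       = {mean:.1f} percent")
print(f"common cause limits       = {lower_limit:.1f} to {upper_limit:.1f} percent")
print(f"max month-to-month change = {upper_range_limit:.1f} percent")

For the actual data, those three outputs are the numbers quoted above: an average of 10 percent, limits of roughly 5 to 15 percent, and a maximum month-to-month change of 6.3 percent.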

No belt required.

Do you realize that you are perfectly designed to get the process results you're already getting?  Unless you know this, any well-intended (but ultimately unsuccessful) efforts to improve such a stable process amount to treating common cause as special.  "Efforts to improve the process" has now become both another input to the process and another component of its natural common cause variation!
 
This scenario was one of several given to me when a large organization asked me to speak at a Lean Six Sigma conference. By then, it was eight months into the next year, and I was able to add those additional months to the previous year's data, which resulted in the following process behavior chart:
 

I wasn't sure what the new goal was (it really doesn't matter).  Regardless, I saw no change from the previous year (the last moving range is not a special cause – it's less than 6.3 percent). Incorporating the additional data into the calculations hardly changed the common cause limits; the previous year's graph, with its twelve data points, was a good enough initial estimate of the situation. So much for the people who insist you must have 20 to 30 data points for an accurate chart.
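As a quick illustration of that check (with hypothetical values – the actual new monthly numbers aren't reproduced here), comparing the newest moving range against the upper range limit is one line of arithmetic:

# Is the latest month-to-month change a special cause?
december, january = 12.4, 9.0      # hypothetical placeholder values
upper_range_limit = 6.3            # from the previous year's chart

jump = abs(january - december)     # 3.4
print("special cause?", jump > upper_range_limit)   # False -- common cause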

Then I got really curious and asked whether they had any more data. They gave me the data for the year prior to that of the first chart above, which resulted in this process behavior chart for all 32 months:
 
Despite all her hard work, there is no convincing evidence that anything had changed over the past 32 months.

After presenting this to a room full of Lean and Six Sigma practitioners, I was met with a stunned silence.

What should you do?
 

Let me first tell you two things not to do

 
(1) There is a widespread, naive assumption being taught that a chart showing only common cause indicates the need for a total process redesign. This is not necessarily (and usually not) true!  

You must resist any knee-jerk urge either to form a process redesign team or brainstorm a cause-and-effect diagram answering, "Why do we have cancellations or no shows?"

Have you all made my past mistake of facilitating the "cause-and-effect diagram from hell" that will surely result?

(2) And please, please tell me that you've gotten beyond the pedantic response, "Dr. Deming says that common cause means it's management's fault and it's up to them to fix it!"

I'm ashamed to admit that I have been very guilty of this approach – over 30 years ago.  Funny thing, no executive ever said "Thank you!" Besides...

...Dr. Deming never said that. It's the common phenomenon of his Funnel Rule #4 at work: over 30 years of retelling, yet again distorting something he originally said.

Is it any wonder that leadership might easily turn a deaf ear to you, especially if (when?) approach (1) above gets vague results?

So what should one do after constructing this plot? – "Don't just do something. Stand there!"

Hint:  By not working on meeting the goal, you will meet the goal.

Similar irony:  By not working specifically on any of the 14 Points while using the appropriate common cause strategy, you will now be able to implicitly work simultaneously on all of them.

Confused? 

How about even more confusion? –  She has been meeting the 10% goal the entire 32 months!

To be continued.

Kind regards,
Davis
==============================================================================
P.S. The quiet use of data sanity is your catalyst to get out of this world and "learn another one" – "built-in improvement" vs. being stuck in the "bolt-on quality" world.
==============================================================================
Data Sanity: A Quantum Leap to Unprecedented Results is a unique synthesis of the sane use of data, culture change, and leadership principles to create a road map for excellence.

One of its major goals is to create a common organizational language for healthier dialogue about reducing ongoing confusion, conflict, complexity, and chaos.
Click here for ordering information [Note: an e-edition is available] or here for a copy of its Preface and chapter summaries (fill out the form on the page).

[UK and other international readers who want a hard copy:  ordering through U.S. Amazon is your best bet]



=============================================================================
Please know that I always have time for you and am never too busy to answer a question, discuss opportunities for a leadership or staff retreat, webinar, mentoring, or public speaking -- or just about any other reason!  Don't hesitate to e-mail or phone me.

==============================================================================
Please visit my LinkedIn page to listen to a 10-minute podcast or watch a 10-minute video interview where I talk about data sanity.

=========================================================
Was this forwarded to you?  Would you like to sign up?
=========================================================
If so, please visit my web site -- www.davisdatasanity.com -- and fill out the box in the left margin on the home page, then click on the link in the confirmation e-mail you will immediately receive.