There is a naïve assumption that improving quality takes nothing more than good people with good ideas doing their best.
As Deming growled many times:
- "They already are...and that's the
problem!"
- "For every problem, there is a solution: simple, obvious...and wrong!"
Paraphrasing my mentor Heero Hacquebord: The most important problems are not the obvious ones. They are the ones of which no one is aware.
Remember the quality circles disaster of the 1980s? Twentieth-century quality giant Joseph Juran's view was that such efforts succeed only when grounded in an already strong, viable improvement culture with executive support. Actually, Lean's foundation of "standardized work" is probably one of the best ways to initially focus such a process.
There is a danger that many "good ideas" could be naively applied to the more obvious, superficial symptoms of deeper, hidden problems. In addition, one must inevitably deal with resistance and unintended human variation at every step of testing an idea and then trying to implement it. And there is the far-from-trivial issue of collecting data during the change -- usually an afterthought, planned ad hoc…if any data are collected at all.
My respected colleague Mark Hamel has astutely observed:
- Human systems don't naturally gravitate to discipline and rigor.
- Most folks are deficient in critical thinking, at least initially. [my emphasis]
And when does "initially" end? Today? Tomorrow? Next Tuesday? Next week? -- in an environment where the leaders are not instilling discipline, prompting critical thinking, or facilitating daily kaizen?
Applying rapid cycle PDSA involves important nuances and the uneven, dynamic, and messy reality of implementation. Any environment has its own unique challenges and opportunities -- the 20 percent of its process causing 80 percent of its problems. Change creates an interplay that is rarely neat and linear, and it is very culture specific.
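To make that 80/20 point concrete, here is a minimal Pareto-tally sketch in Python. The defect categories and counts are entirely hypothetical, invented only to show how a few "vital few" causes typically dominate a problem:

    # Minimal Pareto tally -- hypothetical categories and counts,
    # invented purely for illustration.
    from collections import Counter

    tally = Counter({
        "late lab results": 46,
        "missing paperwork": 31,
        "rework": 9,
        "wrong room setup": 6,
        "supply stockout": 5,
        "other": 3,
    })

    total = sum(tally.values())
    cumulative = 0
    print(f"{'Category':<20}{'Count':>6}{'Cum %':>8}")
    for category, count in tally.most_common():
        cumulative += count
        print(f"{category:<20}{count:>6}{100 * cumulative / total:>7.1f}%")
    # The first one or two categories account for most of the trouble --
    # the "vital few" behind the 80/20 observation above.

In practice you would tally your own categories from your own data; the point is simply that the ranking, not the raw list, tells you where to focus.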
If a conference session or paper makes the process sound as simple as smooth, linear uphill progress, the presenters could either be naively (and dangerously) unconscious of the reality or have sanitized their situation beyond recognition (especially the acceptance of the change). In any case, don't trust any touted results...and ask lots of questions!
Also ask yourself, "What would Deming say?" And I've heard him say it: "What's your theory? Examples without theory teach nothing!"
If you apply critical thinking to your current efforts, you are guaranteed to run rings around the results of any neatly packaged, sanitized example that at best presents the 20 percent of its presenters' process that solved 80 percent of their problem -- in their unique culture.
===============================
By all means, use rapid cycle PDSA
===============================
Remember: there is no avoiding variation -- it is everywhere, including the variation in variation experienced among similar facilities! Effective application of rapid cycle PDSA requires consciously complementing its intuitive nature with the discipline of good critical thinking and appropriate formality (a small illustration of such formality follows the questions below).
This also includes improving the process of using it. After each cycle, ask:
- What unexpected "variation" was encountered? What was learned about the improvement process itself?
- How will it be improved for future interventions?
- What aspects of the specific environment were relevant to the effectiveness of the intervention?
- What were the elements of the local care environment considered most likely to influence change/improvement?
- Will a similar intervention work in a different setting?
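Regarding the "appropriate formality" mentioned before these questions, here is one small, purely hypothetical sketch in Python: an individuals (XmR) control chart used to judge whether period-to-period differences in a measure are signals or merely common-cause variation. The weekly numbers are invented; the 2.66 multiplier on the average moving range is the standard individuals-chart constant:

    # Individuals (XmR) chart sketch -- hypothetical weekly measurements,
    # invented purely for illustration.
    weekly_rate = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 5.1, 3.6, 4.5]

    mean = sum(weekly_rate) / len(weekly_rate)
    moving_ranges = [abs(b - a) for a, b in zip(weekly_rate, weekly_rate[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)

    # Standard individuals-chart limits: mean +/- 2.66 x average moving range
    upper = mean + 2.66 * avg_mr
    lower = mean - 2.66 * avg_mr

    print(f"Mean = {mean:.2f}, limits = [{lower:.2f}, {upper:.2f}]")
    for week, value in enumerate(weekly_rate, start=1):
        flag = "  <-- possible signal" if value > upper or value < lower else ""
        print(f"Week {week:2d}: {value:.1f}{flag}")

Points within the limits are the process merely talking to you; only points outside them suggest that a change genuinely made a difference.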
But remember the crucial importance of initially formulating strong theories to test. Make this a vital part of the "P."
Dialogue based on the following two questions could be a good place to start:
- Why does routine care delivery [or product quality] fall short of standards we know we can achieve?
- How can we close this gap between what we know can be achieved and what occurs in practice?
Until next time...
Kind regards,
Davis