Tuesday, May 17, 2011

The Human Factor in Healthcare

Several times recently, I've been asked by manufacturing folks about the challenges of making the move to healthcare as a lean coach.  The one challenge that I always emphasize is the human factor.  The human factor exists in every industry, but it's magnified in healthcare.  This is partly due to the manual nature of the work, partly due to the fact that the product is the patient, and partly due to the unique cultural aspects of working in an organization that directly saves lives on a daily basis.

As a lean coach in healthcare, you must adjust both your expectations and your tactics.

Adjust Your Expectations

Expect a lot of variation.  I mean a lot of variation.  Even with a calibrated, properly maintained, properly operated piece of machinery, we expect a level of variation.  Now take away the calibration, maintenance, and proper operation and see how much variation you get.  Now take away the machine altogether, replace it with a person, and see how much variation you get.  I could go on, but I think I've made my point.  Expect a lot of variation!

Adjust Your Tactics

As for our tactics, we must adjust them to take into account the human factors.  We have to design around the needs of not only the patient, but also the family of the patient.  We might have to make choices we don't want to make to accommodate the teaching needs of an academic hospital.  We have to define value in terms of not only the patient, but also the payer.  There are so many layers of complexity that prevent us from getting to an optimal future state, but we can't let that stop us from moving towards at least a better future state.  We have to adjust our tactics and be much more agile.

Wednesday, May 11, 2011

Experiments as Nemawashi

Lean folks have heard the term nemawashi.  I've heard it described as preparing the roots of a plant for transport.  It's related to consensus-building, and is especially critical when we are proposing big changes to a process.


I started thinking about nemawashi last week when I was in Six Sigma training.  We were learning about Design of Experiments (DOE), which is a methodical and data-driven approach to testing future-state processes, potential countermeasures, etc.  Immediately, I started to compare and contrast the DOE approach to the less scientific Barn-Raising Kaizen and Quick PDCA approaches that have served me well in the past.  I wondered how we were able to achieve what we did without the rigor that DOE provides.  Then it dawned on me that one of the reasons for our success with these less rigorous and more action-biased approaches was that we were performing a type of nemawashi.

We have all probably seen this formula...


R = Q x A 

...which of course stands for...

  Results = Quality of the Countermeasure x Acceptance Level.

Whenever we test a new countermeasure, we are doing more than collecting data to check the quality of the countermeasure.  We are also impacting the acceptance level for change.  If done right, an experiment can help remove the fear of the unknown, send a message that change is coming, and bring out ideas that don't arise until we see a new process live in action.  These are all signs that nemawashi is being performed.
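To make the formula concrete, here's a toy sketch (the numbers are my own invention, not from any real project) of how a technically superior countermeasure can still lose on results if acceptance is low:

```python
def results(quality: float, acceptance: float) -> float:
    """Results = Quality of the Countermeasure x Acceptance Level."""
    return quality * acceptance

# A near-perfect fix that the staff resists...
resisted = results(quality=9.0, acceptance=0.2)   # about 1.8

# ...versus a merely good fix the staff helped shape through nemawashi.
embraced = results(quality=6.0, acceptance=0.9)   # about 5.4

print(resisted, embraced)
```

The units and scales here are arbitrary; the point is only that acceptance multiplies, it doesn't add.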

Wednesday, May 4, 2011

Small-Batch PDCA

I'm a fan of small batches.  Partially, this is explained by my appreciation of the many fine single-barrel and small-batch bourbons produced in good ol' Kentucky.  But principally, my bias towards small batches is due to the positive impact that batch-size reduction has on process flow, quality, etc.



Normally, we associate batch-size reduction with process improvement.  But if we take a step back and look at our process for conducting process improvement, batch-size reduction is equally applicable.  Specifically, the way we go about testing countermeasures via PDCA can be enhanced by batch-size reduction.  I call this principle Small-Batch PDCA.

What is Small-Batch PDCA?

When we're in the planning phase of PDCA, we have to decide how many countermeasures we want to test during the current PDCA cycle.  There's a trade-off between the number of countermeasures we test and the amount of time, effort, and resources that will be required to conduct the test.  More countermeasures equals more testing complexity.  In order to properly execute a complex test, we might feel the need to utilize a complex tool such as Design of Experiments (DOE).  My bias is to avoid this testing complexity by testing in smaller batches when possible.
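A rough sketch of why the complexity grows (my own arithmetic, not from any DOE curriculum): if we treat each countermeasure as a simple on/off factor, a full-factorial experiment needs a run for every combination, which is 2 to the power of k for k countermeasures.

```python
# Why batching countermeasures inflates test complexity: a full-factorial
# experiment over k on/off countermeasures needs 2**k runs.
from itertools import product

def full_factorial_runs(k: int) -> list:
    """Every on/off combination of k countermeasures (a 2**k design)."""
    return list(product([0, 1], repeat=k))

for k in (1, 2, 3, 5):
    print(k, "countermeasures ->", len(full_factorial_runs(k)), "runs")
# 1 countermeasure needs 2 runs; 5 countermeasures already need 32.
```

Testing one countermeasure at a time keeps each cycle down at the two-run end of that curve.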

By reducing the complexity involved with carrying out a test, Small-Batch PDCA allows us to compress the lead time from idea generation to idea testing.  This gives us the chance to perform more iterations of PDCA, which in turn gives us a chance to adjust our model more frequently.

Is there a downside to Small-Batch PDCA?

One of the drawbacks of Small-Batch PDCA is that we don't get to test the future state in a holistic manner, at least not during the first few rounds of testing.  This means that any data we collect early on might not show the dramatic improvement we want, and in fact, it may be impossible to detect any statistically significant changes in performance.  This is a valid concern, but it's partially mitigated by the fact that if we are willing to go to the gemba and observe the test with our own eyes, we don't have to rely on data as much.
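Here's a toy illustration of that significance problem, with numbers I made up purely for the example: a real-looking two-minute improvement in a process time, measured on only five observations per condition, doesn't clear the usual 5% significance bar.

```python
# Invented data: process times in minutes, 5 observations per condition.
from math import sqrt
from statistics import mean, variance

before = [30, 32, 31, 33, 34]   # baseline process
after = [29, 30, 31, 28, 32]    # after the countermeasure

# Welch's two-sample t statistic
diff = mean(before) - mean(after)
se = sqrt(variance(before) / len(before) + variance(after) / len(after))
t = diff / se

print(f"observed improvement: {diff} min, t = {t:.2f}")
# With roughly 8 degrees of freedom, the 5% two-sided critical value is
# about 2.31, so t = 2.00 falls short of statistical significance even
# though the improvement may well be real.
```

This is exactly where going to the gemba earns its keep: direct observation can confirm what a small data set cannot.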

Plus, there are some important things that just can't be measured, so we usually need to go to the gemba regardless.  In other words, data isn't everything.  Subjective feedback from those involved with the process can be extremely valuable.  Insights gained from direct observation can also be extremely valuable.  Small-Batch PDCA provides us with most of the feedback we need to effectively carry out process improvements, even if the data is not as perfect as we would like.

Tuesday, May 3, 2011

To Sample or Not to Sample

Image courtesy of Stanford School of Medicine
I'm sitting in Six Sigma Black Belt training this week, learning all about two-sample t-tests, ANOVA, and other statistical analysis techniques.  One thing I noticed is that these techniques are based on sampling.  Basically, you collect data from a sample rather than from the whole population.  An example from a hospital would be randomly picking 10 patients from a census of 100 and looking at their infection rates.

Obviously, data from a sample is not as complete as data from the whole population, but often it's complete enough to be statistically reliable.  The benefit of sampling, of course, is that we don't have to go through the time and expense of collecting data for the entire population.  However, thanks to the powerful database software available to us in healthcare and pretty much any industry nowadays, we can easily pull all the data for all the patients in our system, at virtually no marginal cost.  This raises the question--why bother with sampling if we already have the population data?
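The hospital example above can be sketched in a few lines of code; the patient data here is invented purely for illustration, with a 12% infection rate built in.

```python
# Sketch of the sampling example: randomly pick 10 patients from a
# census of 100 and estimate the infection rate.
import random

random.seed(2011)  # make the draw repeatable

# Hypothetical census: 1 = infection, 0 = no infection (12% true rate)
census = [1] * 12 + [0] * 88

sample = random.sample(census, 10)
sample_rate = sum(sample) / len(sample)
population_rate = sum(census) / len(census)

print(f"sample rate: {sample_rate:.0%}, population rate: {population_rate:.0%}")
```

Run it a few times with different seeds and the sample rate bounces around the population rate, which is the whole trade-off in two lines of output.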

I guess we wouldn't, unless there were some added value in sampling beyond the data that we gather.  If we're sampling by just pulling data out of a database, then there's probably not much value beyond the data.  But if we're sampling by directly observing a process, then there's a lot of additional value:  we see the process with our own eyes, we get direct feedback from those involved with the process, we often get to directly hear the voice of the customer (the patient), and we get the opportunity to collect data that we never would have known was relevant just by looking at a database.

So, basically, it's not a question of "to sample or not to sample" but "to go & see or to not go & see."

Sunday, May 1, 2011

3 Quick Thoughts on Copycatting Hospitals

Mark Graban over at the Lean Blog got me thinking about the pros and cons of copycatting (using what works somewhere else to fill a need of your own). I'm not talking about plagiarism, intellectual property theft, or anything like that; I'm just talking about one hospital copying the tools and techniques of another, as opposed to coming to solutions independently. Here are three quick thoughts on copycatting:

  1. Copycatting is supported by the "no need to reinvent the wheel" principle, which is logical and intuitive, especially for hospital folks who are busy saving lives and whatnot.
  2. However, copycatting is a barrier to creative thinking and the building of the problem-solving muscles hospital teams need to foster continuous improvement.
  3. Copycatting precludes the emergence of innovative ideas that other hospitals have not thought of yet.
One last thought...there's something about copycatting that makes me think we are sometimes too scared of failure.  Maybe our risk aversion prompts us to go with what other hospitals have used, as opposed to trying something new and failing.  Does this mindset stem from traditional management philosophy that encourages us to hide problems?