A Costly Myth - The Life of Dr. Lloyd Nelson
A company may appear to be doing well, on the basis of visible figures, yet be going down the tube for failure of the management to take heed of figures unknown and unknowable.
W. Edwards Deming, Out of the Crisis
The famous quote, "If you can't measure it, you can't manage it," is usually attributed to W. Edwards Deming, but the attribution is wrong. Deming called this belief a costly myth. In his book Out of the Crisis, Deming argues that the most important figures are unknown and unknowable, and he credits Dr. Lloyd Nelson as the source of these ideas; a reader of Out of the Crisis will find applications of Dr. Nelson's pronouncements on nearly every page.
Nelson was born in Connecticut in 1922 and graduated from the University of North Carolina in 1943. He served in the U.S. military from 1944 to 1946 and received a Ph.D. in inorganic chemistry from the University of Connecticut in 1950. After teaching for several years, Nelson became fascinated with quality control and statistics, and he dedicated the following years of his professional life to General Electric. In the 1960s, Nelson founded the Journal of Quality Technology for the American Society for Quality (ASQ), which is when Deming first learned of him.
Before the "Miracle in Japan" and Japan's quality revolution, General Electric was renowned for its quality and innovation. Nelson was a consultant statistician for the General Electric Lamp Division and, a few years later, became manager of the Applied Mathematics Laboratory in General Electric's Appliance Division. The following products were introduced while Dr. Nelson was at the Louisville plant:
1955 - Plug-in air conditioners become available.
1956 - First toaster oven: the T-93 Toast-R-Oven.
1958 - First automatic electric can opener.
1963 - First self-cleaning oven: the P-7.
1978 - First over-the-range microwave oven.
In 1979, Nelson joined Nashua Corporation as Director of Statistical Methods at Dr. Deming's recommendation. Nashua was the company featured in "If Japan Can, Why Can't We," where Deming worked with Bill Conway. Deming dedicates a fair amount of time to Lloyd S. Nelson's guidance in Out of the Crisis. Having studied Walter Shewhart and Dr. Deming, Nelson believed that analytical statistics are necessary to understand the unknown and unknowable: to manage an organization, leaders must understand and extract meaning from variation. Dr. Deming refers to this as the Theory of Variation. A key idea is that normal and abnormal behavior patterns can be distinguished using statistical tools; Deming and Nelson call these patterns common cause variation and special cause variation. Studying the causes of variation is how one begins to understand the unknown and unknowable.
Knight Capital is a classic example of the unknown. Knight was the largest trader on the NYSE and Nasdaq in 2012; its Electronic Trading Group executed a daily average of 3.3 billion shares worth more than $21 billion. While deploying high-frequency trading code to an eight-server cluster, a mistake was made: only seven servers were updated. The eighth server ran old testing code that fired a flood of erroneous orders, costing Knight roughly $440 million in 45 minutes. The deployment appeared successful; according to Knight Capital, no one knew the eighth server had not received the new code. No one at Knight Capital understood variation, and it certainly wasn't reflected in their business practices. I have spoken with several IT professionals at high-frequency trading firms like Knight Capital. All of them say a glitch like this can occur; few think it would have taken forty-five minutes to stop. Why? Because they watch for common-cause and special-cause variation.
Large-scale IT operations exhibit variation, and the discipline of understanding that variation is known as statistical process control. Common cause variation is also known as the voice of the process: the variation that will always be present in a given process. An excellent example of the voice of the process is an expert dart player; measure a hundred throws and they will not always hit the center of the dartboard, but nearly every throw will fall within the double ring and bullseye circles. Trying to improve a process that is behaving normally usually results in abnormal conditions; as I've been quoted, "Misunderstanding variation is the root of all evil." Lloyd Nelson said it better:
"In the state of statistical control, action initiated on appearance of a defect will be ineffective and will cause more trouble. What is needed is improvement of the process, by reduction of variation, or by change of level, or both."
Over the years, quality experts have identified distinct patterns of special cause variation. Dr. Nelson is responsible for documenting eight such patterns, known today as the "Nelson Rules."
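The eight rules test a control chart for patterns that are unlikely to arise from common cause variation alone. As a rough illustration only (not Nelson's original formulation, and assuming the process mean and standard deviation are already known), the first two rules might be sketched like this:

```python
def nelson_rule_1(data, mean, sigma):
    """Rule 1: a point more than three standard deviations from the mean.
    Returns the indices of the offending points."""
    return [i for i, x in enumerate(data) if abs(x - mean) > 3 * sigma]

def nelson_rule_2(data, mean, run=9):
    """Rule 2: nine (or more) points in a row on the same side of the mean.
    Returns the index of each point that completes such a run."""
    flagged = []
    streak = 0
    prev_side = None
    for i, x in enumerate(data):
        side = x > mean  # points exactly on the mean count as "below"
        if side == prev_side:
            streak += 1
        else:
            streak = 1
            prev_side = side
        if streak >= run:
            flagged.append(i)
    return flagged

# Hypothetical measurements from a process with mean 0 and sigma 1.
data = [0.2, -0.1, 0.4, 3.6, 0.1, 0.3, 0.2, 0.5, 0.1, 0.4, 0.2, 0.3, 0.6]
print("Rule 1 hits:", nelson_rule_1(data, mean=0.0, sigma=1.0))
print("Rule 2 hits:", nelson_rule_2(data, mean=0.0))
```

Rule 1 flags a single wild point, the classic sign of a special cause; Rule 2 flags a sustained shift in the process level that no single point would reveal.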
Another of Lloyd Nelson's most significant contributions is the Funnel Game. A hands-on exercise created by Nelson and popularized by Deming, the Funnel Game demonstrates the patterns of variation. It is often used to illustrate the danger of tampering with a system without understanding its underlying statistical properties, and it remains a powerful teaching tool for quality control, management, and systems thinking.
The Funnel Game Setup
In the experiment, a funnel is suspended above a surface, and the aim is to drop a marble through the funnel so that it lands on a specific target on the surface below. Each drop's landing spot is marked on the surface. The funnel is stationary at the start of the experiment, but in subsequent rounds it is adjusted according to different rules or strategies.
The Rules:
Four rules are commonly used in the funnel experiment:
Rule 1: The marble is dropped and no tampering occurs; only the points where the marble lands are recorded. A simulated control chart of 50 marble drops shows common cause variation, the voice of the process; no points would be considered special cause variation. This is an untampered system. All systems are subject to inherent variation; as stated earlier, even the greatest dart player will not hit the bullseye 50 times in a row.
Rule 2: Suppose the manager is unsatisfied with the variation of those 50 drops from Rule 1 and demands that every drop hit the exact mark. To compensate (i.e., tamper with the funnel), after each drop the funnel is moved from its current position in the direction opposite the last error. A simulation of 50 drops under Rule 2 shows a scatter about 40% wider than the untampered system; the tampering made things worse, not better. A real-world example of Rule 2 tampering is reacting to a single customer complaint or survey, or adjusting a work schedule, before checking whether the system is stable. After a breach, the CEO of one bank demanded that they hire 50 cyber experts immediately, before anyone even understood why the breach occurred.
Rule 3: Things get significantly worse under Rule 3. Here the funnel is moved to the point opposite the last drop, at an equal distance on the other side of the target. A simulation of 50 marble drops under Rule 3 shows the drops swinging back and forth across the target with ever-increasing amplitude. Goldratt's Theory of Constraints provides the best example of this type of tampering. The Theory of Constraints postulates that to alleviate a bottleneck, one must first identify the constraint by looking for the process step that takes the longest time; once identified, it must be elevated by increasing capacity or reducing demand. Goldratt would describe Rule 3 as a local optimization, and local optimization undermines global optimization.
Rule 4: Rule 4 is sometimes called the Milky Way rule: the more you tamper, the farther you drift from the target. After each drop, the funnel is moved directly over the spot where the last marble landed, so the drops wander off in a random walk. The "telephone game" is an excellent example of Rule 4: each time someone passes on the information, it mutates farther from the original content. Organizations that have inadvertently institutionalized tribal knowledge also follow Rule 4: most knowledge in these organizations is tacit, so transferring explicit knowledge becomes a game of training the trainer. Brent's experience in The Phoenix Project and its impact on Parts Unlimited exemplifies Rule 4.
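The four rules above are easy to simulate. The sketch below is a simplified one-dimensional version (the real experiment is two-dimensional) with a hypothetical drop count and noise level of my choosing; the target is 0, and each marble lands at the funnel's position plus random noise:

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is repeatable

def drops(rule, n=5000, sigma=1.0):
    """Simulate n one-dimensional marble drops; the target is 0.

    The funnel sits at `pos`; each marble lands at pos + noise.
    rule 1: never move the funnel.
    rule 2: move the funnel opposite the last error, relative to
            its current position.
    rule 3: move the funnel opposite the last drop, relative to
            the target.
    rule 4: set the funnel directly over the last drop.
    """
    pos = 0.0
    landed = []
    for _ in range(n):
        z = pos + random.gauss(0.0, sigma)
        landed.append(z)
        if rule == 2:
            pos = pos - z   # compensate from the current position
        elif rule == 3:
            pos = -z        # compensate from the target
        elif rule == 4:
            pos = z         # chase the last drop
    return landed

for rule in (1, 2, 3, 4):
    spread = statistics.pstdev(drops(rule))
    print(f"Rule {rule}: spread = {spread:.2f}")
```

Under Rule 1 the spread stays near sigma. Rule 2 roughly doubles the variance, widening the spread by about 41% (the "about 40% worse" result). Rules 3 and 4 are random walks: their spread grows without bound the longer you run them.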
The Funnel Game Lessons
The Funnel Experiment illustrates the counterintuitive nature of variation and the problems associated with "tampering": adjusting a system in response to natural variation. Rule 1, where no adjustments are made, usually results in a random scattering of points around the target, illustrating the system's natural variation. Rules 2, 3, and 4, where adjustments are made based on previous errors, lead to a broader distribution of points and sometimes a systematic drift away from the target. Well-intentioned adjustments can worsen performance when they are responses to natural variation rather than to actual signals, that is, meaningful changes in the system. By understanding the outcomes of each rule, managers and practitioners learn to distinguish between natural variation and genuine issues that require action, avoiding unnecessary changes that degrade the system.
Most of the stories about Dr. Deming are about how he influenced industries and thought leaders. However, many people profoundly influenced Deming, and Dr. Lloyd Nelson was one of them.