The Cost of Queues
Lean practitioners and scientists have known for many decades that cost effectiveness isn't all it is cracked up to be. Cost effectiveness is a measure of how much of a resource's capacity is utilized. The resource may be a person, a computer, an aircraft, a machine in a workshop...
Most people believe that workers should be as close to 100% cost effective as possible. That is, their capacity should be fully utilized. And yet, for other kinds of resources, it is well understood that 100% utilization is a bad idea.
Take your computer, for example. If you run one application, the computer is responsive and you can work very fast. With one major application running, your computer uses only a fraction of its capacity. If you decide to run your computer in a more cost-effective manner, and utilize its maximum capacity by running a large number of programs at the same time, you will find that the computer slows down and becomes unresponsive. At 100% capacity utilization the computer is frozen. Nothing happens, even though the processor runs at full tilt and the fan strains to cool the system.
This effect is not specific to your computer. It is a result of how processes work. As capacity utilization of a resource increases, a queue builds up in front of that resource. Because new tasks arrive at varying intervals, queues begin to build long before capacity utilization reaches 100%. The same thing happens in processes where people do the work. (We actually treat machines better than people in many cases. For example, programmers know they should not overload a server, but their project managers often do not know they need to protect the programmers from overload. Nor do the programmers themselves understand the consequences of being overloaded.)
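The nonlinear buildup can be sketched with the standard M/M/1 queueing formula. This is an assumed model (the post does not name one), but it illustrates the principle: the average queue length grows explosively as utilization approaches 100%.

```python
# Average number of tasks waiting in an M/M/1 queue: Lq = rho^2 / (1 - rho),
# where rho is capacity utilization (0 <= rho < 1).
# Assumed model: random (Poisson) arrivals, one server -- an illustration,
# not a claim about any specific process.
def queue_length(rho):
    return rho ** 2 / (1 - rho)

for rho in (0.50, 0.80, 0.90, 0.95, 0.99):
    print(f"utilization {rho:.0%}: average queue length {queue_length(rho):.1f}")
```

Going from 50% to 90% utilization multiplies the average queue by roughly sixteen, and the last few percentage points are far worse: the queue at 99% is more than ten times longer than at 90%.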
Queues increase lead times, because tasks spend time waiting in them rather than being worked on by one of the resources in the process.
If you try to become more cost effective by reducing capacity, and thereby capacity cost, all will be well at first, from an economic point of view. (The people who are let go are usually of a different opinion.) The catch is that this will increase the queues in the system. This increases lead times. Consequently, cost associated with lead time will also increase. (In manufacturing there is also a considerable storage cost due to increased inventory, but we will ignore that for the purposes of this blog post.)
A diagram of the relationship looks something like this:
The diagram may of course vary in shape depending on several factors, including how much variation there is in the system. (There are several disciplines, like Six Sigma, that focus on optimizing processes by managing variation.)
What the diagram says is that beyond a certain point, pushing for more cost effectiveness increases total cost. In his book The Principles of Product Development Flow, Donald Reinertsen reports that software developers are often loaded to about 98.5% of capacity, which means the increase in queueing cost is far greater than the savings in capacity cost.
Here is a Current Reality Tree showing how overloading people and other resources affects the Return On Investment of a process:
Most managers have practical experience of projects that just do not get anywhere, or production lines where orders are often delayed. Yet most do not stop to think about the cause of the problems, or how to fix them. This is a pity, but it is of course also an opportunity for those companies where managers, especially top executives, are seriously interested in solving the problems. If they do, they can leave competitors eating dust.
One would think that most companies already are very competitive, but the truth is that they are not. Many are choking themselves by being too cost effective. This causes more damage to them than their competitors ever will.
Another thing to consider is that large queues slow response times. Today, when the ability to respond quickly is becoming more and more important, not understanding the cost of queues can easily kill a company.