Batching: A case study
In the previous article, we introduced a formal description of batch picking, as well as the general pitfalls associated with this process type. In this article, we will discuss the causes of the perhaps counter-intuitive fact that batching typically costs more time and produces more errors than a flow process.
Overhead and setup creep
Batching always costs additional resources: instead of simply completing the full process as it arises, you are now partitioning it into smaller lots or tasks. This requires organization and additional steps, which equates to higher resource consumption. These additional steps invariably lead to higher error rates.
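As a rough illustration (the step counts below are hypothetical, not measurements from any real operation), a toy model of "touches" per order shows where batching's extra work comes from: each batch adds fixed organizational work (sorting, staging, matching) on top of the per-order work that both approaches share.

```python
def batch_touches(orders: int, batch_size: int,
                  setup_per_batch: int = 3, per_order: int = 5) -> int:
    """Toy model: batching adds fixed setup/organization work per batch
    (sorting, staging, matching) on top of the per-order work."""
    batches = -(-orders // batch_size)  # ceiling division
    return orders * per_order + batches * setup_per_batch

def flow_touches(orders: int, per_order: int = 5) -> int:
    """One-piece flow: each order is completed as it arises, no batch setup."""
    return orders * per_order

print(batch_touches(100, 10))  # 530 touches
print(flow_touches(100))       # 500 touches
```

The exact numbers are invented; the point is structural: the batch total always carries an extra term proportional to the number of batches, and every extra touch is an extra opportunity for error.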
Forecasting leads to over- or underproduction
Forecasting is a special type of overhead that deserves its own name due to its potentially disastrous consequences. Vineyards are a great example of a business where forecasting can go very wrong.
Our example concerns true French champagne. Since the time to go from growing grape vines to distributing a batch of champagne (the lead time) can be very long, the producer must estimate demand for the product far ahead of time. This is tricky work. Fortunately for producers, champagne does mature. In this case, however, the producers did not foresee that a major PR mistake would crater their demand over the coming decade. Not only that, but the mistake gave a minor competitor enough room to grab a major market share.
Add to this the fact that they became overburdened with supply that year (which, remember, negatively impacts price!). Altogether, 2006 was a bad year for Cristal.
Errors compound with batch processing, and defects affect larger sets of work in progress (WIP)
Errors compound: when I make a mistake somewhere in a batch, it affects other parts of that batch, because all of the materials being processed are linked. Using too much of one resource limits what remains for further production. If I try to account for this by obtaining extra resources, I'm wasting time! As with the Second Law of Thermodynamics: you can't win, you can only HOPE to break even!
Defects affect larger sets of work in progress (WIP): with batch processing, the number of defective units created by a defect-inducing process step will always be higher than in a flow process, because every unit in the batch passes through that step before the defect is caught.
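A minimal sketch of this second point, assuming (hypothetically) that defects are only detected at an inspection that happens after the processing step completes:

```python
def defective_units(batch_size: int, total_units: int) -> int:
    """Units exposed to a defect that is caught at end-of-batch inspection.

    In batch mode, the whole batch passes through the faulty step before
    inspection, so every unit in the current batch is spoiled.
    In one-piece flow (batch_size == 1), inspection follows each unit,
    so only the single unit in progress is affected.
    """
    # All WIP that passed the faulty step before detection is spoiled.
    return min(batch_size, total_units)

# Hypothetical day: 100 units, one defect-inducing step.
print(defective_units(batch_size=100, total_units=100))  # batch: 100 spoiled
print(defective_units(batch_size=1, total_units=100))    # flow: 1 spoiled
```

The larger the batch, the larger the blast radius of a single bad step; one-piece flow caps it at a single unit.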
Enough theory, time for an illustrative case study!
Batch processing in an eCommerce shipping environment
When JerkyXP was initially performing its own fulfillment, it used an order-batch approach: it attempted to fulfill 100% of a shipping day's orders in one "go". What this looked like in practice is described below.
Print shipping labels for all outstanding orders.
Print invoices for all outstanding orders.
(For every outstanding order, repeat:)
1. Select order
2. Match label with invoice
3. Pick goods
4. Double check everything
5. Seal packaging / pack and ship order
The biggest issue I'd like to draw your attention to is step (4). This process was so error-prone that JerkyXP injected an additional band-aid step into it, and the results were still completely unacceptable. This is a symptom of the breakdown of batch processing: we attempt triage. Consider how many potential errors were created simply by printing all the invoices and labels at once!
Let's calculate them! For just ten orders, there are 10! = 3,628,800 possible ways to pair the shipping labels with the invoices, only one of which is correct. That leaves 3,628,799 ways to get the pairing wrong, and millions more ways to get the order itself wrong; yet just ONE way to get everything right.
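The pairing count is just a permutation count, which we can verify with a few lines of Python:

```python
from math import factorial

def wrong_pairings(n_orders: int) -> int:
    """Number of ways to pair n labels with n invoices incorrectly.

    There are n! possible assignments of labels to invoices and exactly
    one correct assignment, so n! - 1 ways to get at least one pair wrong.
    """
    return factorial(n_orders) - 1

print(wrong_pairings(10))  # 3628799 incorrect pairings for ten orders
```

And this counts only the label-invoice pairing; mispicks within each order multiply the error space further.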
This is perhaps the most damning indictment of batching we've yet discussed.
So, JerkyXP used this order-batch approach under the presumption that it was faster and more accurate. Once they switched to a flow method, they found that each order took 30% less time and was 150x less error-prone. Stay tuned to hear more about flow!
You can read the next (and final) article here!