A warehouse’s process quality has large effects on its cost structure. Each defective unit in a process requires re-processing and rework, and those extra touches add directly to the overall labor requirement. You may be surprised by how much!
Process Quality Is Measured By First Pass Yield
First Pass Yield (FPY) is a manufacturing term for the rate of units that come out acceptable the first time through a process. It is calculated as acceptable units divided by total units produced. So if a machine produces 100 units with 97 being acceptable, the FPY is 97%.
This also works in warehousing. If a process generates a percentage P of unacceptable units that require rework, then 1-P = the process FPY. For example, if Receiving receives 100 units but generates two errors that have to be reworked, the FPY of Receiving is 98%.
The overall site First Pass Yield can be calculated as the product of the FPYs of its processes.
So a site with five processes would have an overall FPY of FPY(process1) * FPY(process2) * FPY(process3) * FPY(process4) * FPY(process5).

Setting aside units-of-measure differences between processes, this gives an idea of how errors compound through a facility.
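The compounding can be sketched in a few lines of Python (the 98% figure below is an illustrative assumption, not a benchmark):

```python
# Site FPY is the product of each process's FPY, so even small
# per-process error rates stack up across a facility.
from math import prod

def site_fpy(process_fpys):
    """Overall First Pass Yield for a sequence of processes."""
    return prod(process_fpys)

# Five processes, each at 98% FPY (a 2% error rate each).
overall = site_fpy([0.98] * 5)
print(round(overall, 4))  # about 0.9039: roughly 1 unit in 10 hits a defect somewhere
```

Five processes that each look "pretty good" at 98% still leave the site below 91% overall.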
Auditing And Sampling Can Determine FPY
As a practical matter, it is difficult for sites to track process quality rates. Most metrics are from customer-facing data, cycle-counting, or – sometimes – pick-specific metrics.
But the way to determine a process’s error rates is simple: Conduct audits, and apply sampling math to determine the confidence level and margin of error.
Short of 100% checks, a site usually won’t be able to immediately identify every defect created. The way to estimate FPY for a process is to conduct audits, or sets of inspections, of that process.
For example, have an employee check every third unit on a pack line to ensure the product and packaging are correct, and record any defects. When the employee has completed many checks, the facility can use raw number or sampling math to calculate the error rate of the process.
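The sampling math can be sketched as follows, using the standard normal approximation for a proportion at 95% confidence (the audit counts below are illustrative assumptions):

```python
# Estimate a process error rate from audit samples and attach a
# margin of error (normal approximation for a proportion).
from math import sqrt

def error_rate_estimate(defects, inspected, z=1.96):
    """Return (estimated error rate, margin of error); z=1.96 is 95% confidence."""
    p = defects / inspected
    margin = z * sqrt(p * (1 - p) / inspected)
    return p, margin

# Say an auditor checked 600 units on the pack line and logged 12 defects.
p, moe = error_rate_estimate(12, 600)
print(f"error rate {p:.1%} +/- {moe:.1%}")  # 2.0% +/- 1.1%
```

More inspections shrink the margin, so the facility can trade audit labor for confidence in the estimate.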
Rework Affects Processing Load
The book “Factory Physics” points out that a machine must actually be able to process (1+P) times the total demand, or else work will back up at the machine. Why? The reworked units from P must eventually go through the same process step again. The demand doesn’t just disappear after a unit fails!
Think about a pick short. A picker can’t find a unit. So there’s often another pick if the SKU is available, plus a cycle count to check for inventory accuracy. Or the customer gets shorted on the order, which is also costly. So there is 1) repeat of the processing step, plus 2) rework.
While it may not always be as clear-cut as in this example, there is always rework plus another touch on the item to move it forward to the next processing step.
Together, this has important implications for labor cost in warehouses.
How Process Quality Affects Labor Requirements
Rework plus re-processing will increase labor requirements in a warehouse. At minimum, a P% error rate adds roughly 2 * P% to the labor required: one extra touch to re-process each failed unit, plus the rework itself. Even modest process error rates can have a high impact on site labor budgets.
Here’s a rework labor planning example:
1) Remember, the actual required throughput is (1+P) times planned units. Your demand plan might say to receive 5000 units daily, but you should plan labor for (1+P) * 5000 units. If this isn’t planned, then there will be a buildup of work to process later. Do you know your P?
2) We haven’t talked about rework effort yet. The rework plan would be labor for P * 5000 units of rework. And rework is often a slower, less efficient process to complete.
3) Now we see impacts at the site level with some simplifying assumptions: Assume 5 processes, Receiving, Putaway, Picking, Packing, Stage / Load, each handling equal rates of throughput.
In this simplified example, you need to plan direct labor for throughput beyond your planned volume PLUS rework labor for each of your processes. That’s a meaningful addition on top of the planned number – a very unpleasant surprise!
Since rework labor is often slower than normal processing, overall efficiency is going to be less than regular processing.
Here’s a numerical example, assuming 5k units volume at steady-state, reasonable defect rates, and some assumed costs:
You need to plan 7% more labor and you’ll operate at 93% efficiency, right off the bat!
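The planning math above can be sketched in Python. The per-process defect rates and the rework-effort penalty below are assumptions for illustration, not the article’s exact figures, so the resulting percentage differs from the 7% example:

```python
# Labor plan with rework: each process must handle its failed units
# again (re-processing) plus the rework itself, which is slower.
PLANNED_UNITS = 5000           # daily units through each process
PROCESSES = {                  # assumed defect rate P per process
    "Receiving":  0.02,
    "Putaway":    0.01,
    "Picking":    0.015,
    "Packing":    0.01,
    "Stage/Load": 0.005,
}
REWORK_EFFORT = 1.3            # assume a rework touch takes 1.3x a normal touch

base_touches = 0.0
extra_touches = 0.0
for process, p in PROCESSES.items():
    base_touches += PLANNED_UNITS
    extra_touches += PLANNED_UNITS * p                 # re-processing failed units
    extra_touches += PLANNED_UNITS * p * REWORK_EFFORT # the rework itself, slower

extra_pct = extra_touches / base_touches
print(f"extra labor to plan: {extra_pct:.1%}")
```

Swap in your own audited defect rates and rework effort to size the gap for your site.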
Is this realistic? You be the judge. Here are figures from the WERC 2023 DC Measures report, showing quintile performance metrics across the industry:
Median inventory accuracy is 98%, and median order-picking accuracy is 99%.
So 1-2% of process error at each step is not unrealistic for many sites.
Yes, your site numbers may not be as extreme. And finding FPY at a process level can be tough. But it’s easy to see where labor can leak out of quality gaps, and why taking the effort to complete audits can help drive massive financial results.
Calculating the Cost
Here is a formula that can go into Excel to determine the cost of rework, summing over processes P1 through Pn:
Rework Cost = Σ over processes Pn of [ VolP * Rd * (CPUP + CPUReworkP) ]
Where:
- VolP = Volume of process P
- Rd = Rate of Defects
- Pn = Process N
- CPUP = Cost Per Unit of process P
- CPUReworkP = Cost per Unit of rework in process P
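Using the same variable definitions, the formula can be sketched in Python as well as Excel. The volumes, rates, and costs below are illustrative assumptions:

```python
# Rework cost: each defective unit must be re-processed at the normal
# cost per unit (CPU) AND reworked at the rework cost per unit.
def rework_cost(processes):
    """processes: list of (volume, defect_rate, cpu, cpu_rework) per process."""
    return sum(vol * rd * (cpu + cpu_rework)
               for vol, rd, cpu, cpu_rework in processes)

example = [
    # volume, defect rate, cost/unit, rework cost/unit  (assumed figures)
    (5000, 0.02, 0.30, 0.45),   # Receiving
    (5000, 0.01, 0.25, 0.40),   # Putaway
]
print(f"daily rework cost: ${rework_cost(example):,.2f}")
```

Extending the list to all five processes gives a site-level daily cost of quality gaps.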
While this is a good guide for sizing, it may not account for every real-world complication.
Real-World Complications
The above example is simple compared to the real world. A real-world analysis would include all sequential processes, their processing times, and their error-rate assumptions, then apply the resulting unit costs across both rework and normal processing.
One big complication is figuring out what unit of measure to record defects in. If a case is incorrectly stored in a broken-case picking area, is that one error (the case) or 100 errors (one for every each in the case, any of which may be mis-picked)? How should you track the defect rate? This choice matters for the final handling numbers.
In warehousing, the FPY of a process is harder to observe and measure than in a factory. Errors are not always identified immediately, and could be found hours, days, or weeks after they’re made.
For example, a receiving mis-labeling error might not be identified until the picker has picked an incorrect item and it is passed to packing, where it is caught at verification. While the overall cost to re-process remains similar to what’s described above, the time between error and effect can be much larger than in a manufacturing setting.
This also means that auditing must capture data not only about the error, but also enable labeling data about the origin and cause of the error. It does no good to attribute a receiving error to the picking department.
Further, rework flows can be complicated. When the receiving error caught at packing is reworked downstream of packing, how should we think about the rework for packing vs for receiving? The attribution of problem-solving labor can only be done if the individual errors are tracked.
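One way to make that attribution possible is to record both where a defect was found and where it originated on every audit entry. A minimal sketch, where the record fields are assumptions rather than any standard:

```python
# Each defect record carries origin attribution so rework labor is
# charged to the process that caused the error, not the one that found it.
from dataclasses import dataclass

@dataclass
class DefectRecord:
    sku: str
    found_in: str        # process where the defect was caught
    originated_in: str   # process that caused it (may differ)
    description: str

rec = DefectRecord("SKU-123",
                   found_in="Packing",
                   originated_in="Receiving",
                   description="mis-labeled at receipt, caught at pack verification")
# Charge this error against Receiving's FPY, not Packing's:
print(rec.originated_in)  # Receiving
```

With origin captured per record, FPY by process falls out of a simple group-by, even when errors surface far downstream.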
Conclusion: Quality Drives Significant Labor Cost
“Safety, Quality, Cost.” That’s the operator mantra.
Your site’s specific volume, error rates, and rework effort will drive your numbers and impacts. But diving deep into process quality and targeting improvements can have huge impact on real dollars.
Sites can do this by increasing problem-solving labor until they get to a steady-state of flow. But this is a reactive measure.
The proactive way is to measure error rates through audits. Audit results will help identify FPY rates and error causes. You can use those to eliminate errors and maintain process quality.
Even if attribution is difficult, it is much better to have a picture that is fuzzy around the edges than no picture at all.
To help with finding your error rates, check out QualVis Audit App, or contact PL Programs for our engineering and project management programs.