Proof of Concept Test
Collaborating with Unlikely Partners to Make Data Driven Recommendations
01
Opportunity
Small boxes were tipping over on conveyor lines in Vitacost fulfillment centers, disrupting operations and decreasing warehouse efficiency.
02
Solution
A proof of concept test to determine the feasibility of two proposed workflows intended to increase sort speed and reduce waste.
03
Result
A qualitatively and quantitatively supported recommendation that prevented an unjustifiable $550,000 increase in warehouse labor.
Small Boxes, Big Problems
Small and extra-small boxes constantly tip over, disrupting conveyor operations and risking inventory loss. In a fast-paced fulfillment center, these frequent slowdowns didn’t just hurt efficiency—they set off literal and figurative alarm sirens.

Understanding Stakeholders’ Motivations
Collaborators: System Analysts, FC Site Leads, FC Department Leads
Stakeholders tasked us with designing a workflow to remove small boxes from the conveyor line. To uncover the root problem and align on goals, I led a two-part workshop. Our goal wasn’t to implement their exact workflow but to understand their motivations and concerns.
Key Insights:
✅ Cost Savings Potential – Eliminating small boxes could reduce waste and improve visibility for bagged shipments.
❌ Mixed Leadership Buy-in – Some stakeholders questioned feasibility and ROI.
📦 Space Constraints – Warehouse layouts varied, limiting available square footage.
👷 Labor Shift – The new workflow reduced picking labor but increased packing labor—critical in a labor-heavy environment.
Two Approaches, Same Goal
Collaborators: Analysts, Backend Developers
Leadership proposed a workflow, but we explored an alternative. Both approaches removed small boxes from the conveyor line but differed in their sorting methods:
✔️ Sort by Location – Items were sorted purely by placement, ignoring individual orders.
🔄 Sort by Order – Items stayed grouped by their final destination.
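The core difference between the two approaches is simply the key used to group items. A minimal Python sketch makes the contrast concrete (the item fields and values here are hypothetical illustrations, not the production data model):

```python
from collections import defaultdict

# Hypothetical items: each has a storage bin and a destination order.
items = [
    {"sku": "A1", "bin": "B-07", "order_id": 101},
    {"sku": "A2", "bin": "B-07", "order_id": 102},
    {"sku": "A3", "bin": "C-02", "order_id": 101},
]

def sort_by_location(items):
    """Group purely by warehouse placement; order identity is ignored."""
    groups = defaultdict(list)
    for item in items:
        groups[item["bin"]].append(item["sku"])
    return dict(groups)

def sort_by_order(items):
    """Keep items grouped by their final destination (the order)."""
    groups = defaultdict(list)
    for item in items:
        groups[item["order_id"]].append(item["sku"])
    return dict(groups)

# sort_by_location(items) -> {"B-07": ["A1", "A2"], "C-02": ["A3"]}
# sort_by_order(items)    -> {101: ["A1", "A3"], 102: ["A2"]}
```

Same items, same goal of getting small boxes off the conveyor; only the grouping key changes, which is why the labor and training implications diverged.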
Defining Success: Metrics That Matter
Collaborator: Process Automation Engineer
With biases on both sides, we partnered with a process automation engineer to establish meaningful success metrics:
Quantitative Metrics:
⏳ Sort Time – How long does it take to sort items?
📊 Associate Performance – How does experience impact speed?
🛠 Indirect Labor – How much additional labor is required?
Qualitative Observations:
🔧 Station Configuration – What setup works best?
💪 Physical Strain – How does the process impact fatigue?
🖥 Interface Feedback – Is the QC app intuitive?
🗣 Associate Sentiment – Which workflow do associates prefer?
Building a Realistic Prototype
Our team worked primarily in .NET, which limited flexibility but ensured the prototype felt familiar. This constraint highlighted two things:
1️⃣ Consistency for Associates – Aligning with existing workflows reduced friction.
2️⃣ A Case for Change – Demonstrating .NET's limitations strengthened the argument for transitioning to React.
We knew a prototype would save dev time while still allowing responsive testing. To set expectations:
✅ What We Could Fake:
✔️ Familiar UI & Interactions
✔️ Realistic Orders & Response Times
✔️ Item Scanning for Reliable Data
❌ What We Couldn’t Fake:
✖️ Captured Errors in Real-Time
✖️ Full Sort Complexity
Testing an Interface for Two
This proof of concept (POC) had an extra challenge—it needed to work for both:
👷 Warehouse Associates – Needed clear, touch-friendly interactions and realistic item scanning.
🖥 OpsDev Team Members – Acted as the “driver,” triggering prototype interactions to mimic real-time actions.
Aligning Design with Usability Best Practices
We designed the prototype with Nielsen Norman usability heuristics in mind:
📌 Clear Directions (Help & Documentation + Real-World Match) – Integrated guidance and visual cues helped associates navigate seamlessly.
📦 Real-World Affordances (Visibility of System Status) – Items had clear drop zones and feedback on placement.
⏪ Undo Options (User Control & Freedom) – Allowed prototype drivers to correct accidental scans without disrupting testing.
📡 Scanned Feedback (Visibility of System Status) – Click-triggered scans mimicked real-time system responses.

Validating with Concept & Usability Testing
We tested workflows with packing site leads for feasibility insights, then ran usability testing with a department lead and developer to refine the interface and ensure seamless interactions.
A Collaborative Solution to Capture Metrics
To capture accurate sort times between workflows, we leveraged an existing barcode workaround—proving that even simple solutions can drive meaningful insights. By introducing a supporting player to mimic associates’ scans and log times in an Excel sheet, we were able to track not just direct labor (sorting time) but also indirect labor (box setup and other prep tasks). This method provided a clearer picture of efficiency across workflows, reinforcing the need for design enhancements that streamline both sorting and setup processes.
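The timing method above boils down to deriving elapsed intervals between named scan events. A minimal sketch of that calculation, assuming hypothetical event names and timestamps (the real log lived in an Excel sheet):

```python
from datetime import datetime

# Hypothetical scan log: (timestamp, event) pairs as the supporting
# player might record them during a test run.
scans = [
    ("2024-01-01 09:00:00", "setup_start"),  # box setup begins
    ("2024-01-01 09:01:30", "sort_start"),   # sorting begins
    ("2024-01-01 09:04:00", "sort_end"),     # sorting ends
]

def seconds_between(log, start_event, end_event):
    """Elapsed seconds between two named scan events."""
    times = {event: datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
             for ts, event in log}
    return (times[end_event] - times[start_event]).total_seconds()

# Direct labor: time spent actually sorting.
direct_labor = seconds_between(scans, "sort_start", "sort_end")      # 150.0
# Indirect labor: prep work like box setup before sorting begins.
indirect_labor = seconds_between(scans, "setup_start", "sort_start")  # 90.0
```

Splitting the log this way is what let the test distinguish direct labor (sorting) from indirect labor (setup and prep) rather than reporting one blended number.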
Final Verdict: Not Worth the Squeeze 🍊
As our automation engineer crunched the numbers, I synthesized observations and interviews. The findings?
📊 Sort speeds were similar across both workflows.
🧑‍🏭 Experience level impacted Sort by Order more, making training harder.
📦 Sort by Location struggled with tote density, slowing down efficiency.
Despite potential benefits like reduced waste and increased visibility, the data did not show these gains outweighing the increase in labor cost.
In other words? The juice wasn’t worth the squeeze. 🍊