As the lead product designer on a national grocery chain's operations app, I had the opportunity to tackle produce shrink (loss to spoilage or mishandling), a problem that cuts into company profits. Teaming up with business partners, department leads, and produce clerks, I helped raise forecast accuracy by 12% and prevent $1.6 million in lost profit during the first nine months.
Forecasts for produce conditioning were managed through paper guides printed by produce department leads at the beginning of each week.
The problem with the paper guides was that the data used to forecast conditioning amounts was typically outdated and lacked visibility into current store conditions, leaving clerks second-guessing the numbers. Store leaders also had little insight into how well conditioning tasks were being executed beyond micromanaging the process or relying on manual audits to enforce best practices.
Veteran clerks knew better than to trust the guide and would do the math themselves, but rookie clerks who followed the guide were often left wondering why there were holes on their shelves or why so much of their produce went bad.
Our challenge: capture what veteran clerks know and use that knowledge to improve the accuracy of the forecasting algorithm while giving store leaders more visibility into the state of the conditioning process.
I worked with the product manager and our business partners to evaluate three vendor products, looking at cost, process alignment, and security. In the end, each product either didn't meet our business requirements, wasn't cost-effective, or wouldn't align with the existing conditioning workflow.
With a better understanding of how other products attempted to solve the problem, we moved forward with building a custom app. Our MVP would capture clerks' adjustments (no analytics or reporting at first) and would be tested in stores, with clear results, by the end of 8 weeks.
During discovery, I led cross-functional problem-framing and alignment workshops with our business partners to clarify the different ways shrink affects stores and eats into profits. Together, we defined the primary business goals for the solution, as well as the constraints we were up against:
Capture what clerks know and use that information to improve forecast accuracy by 5%.
Clerks start and stop the task often and need to resume the work quickly and easily.
Capture store data with minimal taps to avoid adding time to task completion.
Business leadership wanted the MVP for the solution tested in stores within 8 weeks.
The solution needed to capture and show task performance for the week at a glance.
New design tools needed to empower clerks to work across departments with zero training.
To get our users' perspective, I visited three stores, arriving at 5 AM (or close to it) each time to observe how produce clerks complete the conditioning process. I watched clerks consistently check their watches or phones, rushing to get as much done as they could before the store opened and customers needed their attention. This was my first indication that, alongside our primary goal of improving data quality, designing the solution for minimal inputs would be just as important if we had any hope of company-wide adoption.
With a clearer understanding of the conditioning process, I created a low-fidelity sketch of the primary screens where I envisioned the majority of the work would be completed. I then went back to the stores to share the sketch and get clerks' reactions.
The overall vibe of the feedback was positive, but both clerks and department leads pointed out some things that needed to be addressed:
Based on that feedback, I refined the concept to include historical data and added a swipe-to-confirm interaction. The next iteration showed the full workflow, provided historical context for the task, and further minimized data entry, which would likely make or break adoption of the tool.
The sentiment stayed positive overall, but associates continued to voice concerns about the time it would take to use the tool.
Next, I built a clickable prototype to test the solution in stores. In the process, the data-science team suggested capturing the reason for each conditioning adjustment to further train the algorithm. After working hard to minimize the number of taps, adding even a single tap seemed unthinkable, but the data-science team was convinced that without that information we might not hit our goal of improving data accuracy by 7.5%. After some quick adjustments, I added an "Adjustment Reason" selector to the prototype and went back to the stores for testing.
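For illustration only, here's a rough sketch of the kind of record the prototype was meant to capture. The field names, reason codes, and helper function are my own placeholders, not the production schema; the point is that the reason is only asked for when a clerk overrides the forecast, so a plain confirmation stays a single tap.

```typescript
// Hypothetical shape of a conditioning entry -- illustrative only,
// not the app's actual data model.
type AdjustmentReason =
  | "overripe_on_arrival"
  | "damaged_in_transit"
  | "display_overstocked"
  | "unexpected_demand";

type Activity = "rinse" | "trim_rinse" | "trim_soak";

interface ConditioningEntry {
  storeId: string;
  itemId: string;                       // e.g., conventional bananas
  activity: Activity;
  forecastQty: number;                  // quantity suggested by the algorithm
  actualQty: number;                    // quantity the clerk actually conditioned
  adjustmentReason?: AdjustmentReason;  // captured only when actualQty differs
  recordedAt: string;                   // ISO-8601 timestamp
}

// Confirming the forecast stays a single tap: the app simply echoes the
// suggested quantity, with no adjustment reason attached.
function confirmForecast(
  entry: Omit<ConditioningEntry, "actualQty" | "adjustmentReason">
): ConditioningEntry {
  return { ...entry, actualQty: entry.forecastQty };
}
```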
The test itself was simple. I asked the produce clerks at each store to condition three conventional items and three organic items from each activity type, first using the paper guide and then the digital guide, all while I timed them.
The test was a success, showing that the digital guide added no significant time to the overall conditioning process and might eventually prove faster once clerks got used to the tool. Although inputting an adjustment took a couple of seconds longer than writing the number in the paper guide, the increase was offset by how quickly clerks could confirm the items that needed no adjustment, a share that would, in theory, grow as the forecasts became more accurate.
The final design combined "smart" conditioning targets, minimal-tap actions, and a historical snapshot into a comprehensive produce conditioning tool. Testing the solution showed that produce clerks could log changes and feed richer data to the algorithm as fast as writing on paper, helping users to trust the conditioning data again.
"Seeing the task progress throughout the morning is a big help to figure out where people need to focus. I also like seeing the reason why adjustments were made. It helps with sustainment checks and makes it easier to think through my orders for the upcoming week."
Produce Department Lead
"I didn't like it [the app] at first. The paper guide seemed much faster. Once I got used to it though, I actually like it a lot. It takes about the same time [to use], but it makes the sustainment checks so much easier. And I actually hit my shrink goal the last two weeks."
Produce Clerk
Grouped items by activity (rinse, trim & rinse, trim & soak) to eliminate guesswork and speed up decision-making (a rough sketch of this grouping follows the list).
Added a single button to confirm standard conditioning, preventing multi-step inputs for the majority of produce items.
Added task progress to the app home screen so department leads can clearly identify when a clerk needs help.
Offered video tutorials directly in the task details to improve the accuracy of task execution and reduce training time.
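To make the grouping concrete, here is a minimal sketch assuming a hypothetical GuideItem shape; the names are placeholders rather than the app's actual code, but they show how the guide can present one list per task type.

```typescript
// Illustrative sketch: group the day's guide items by conditioning activity
// so clerks see one list per task type. Names are placeholders.
type Activity = "rinse" | "trim_rinse" | "trim_soak";

interface GuideItem {
  itemId: string;
  name: string;
  activity: Activity;
  forecastQty: number;
}

function groupByActivity(items: GuideItem[]): Map<Activity, GuideItem[]> {
  const groups = new Map<Activity, GuideItem[]>();
  for (const item of items) {
    const bucket = groups.get(item.activity) ?? [];
    bucket.push(item);
    groups.set(item.activity, bucket);
  }
  return groups;
}
```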
The best interface is sometimes simply showing accurate data within the appropriate context.
Even experienced users will adopt new tools if the solution provides value and respects their expertise.
Adding mere seconds to a highly repetitive task can have major operational impacts over time.
Users often have critical knowledge that algorithms will never know unless you ask them for it.
Weekly printouts used stale data to produce mediocre results, frustrating veteran clerks and confusing those less experienced. The digital conditioning guide changed all that and delivered value for both associates in the store and our business partners.