The Gist
The problem: A personalized loyalty perk — 14-day early access to favorite brands — was generating interest but not consistently converting to action. The experience was designed around one ideal path, while real members were following three distinctly different ones.
The approach: A multi-phase research effort combining usability testing on the brand selection and notification flows, a diary study tracking real member behavior across the 14-day window, in-store tag-alongs observing what happened when members arrived, and journey mapping across the full end-to-end experience.
The outcome: Research reframed the problem — from "the perk isn't compelling" to "the experience only supports one of the three ways members actually behave." Teams focused on the moments where confidence broke down, including a previously overlooked gap: store associates who were entirely unaware of the perk.
Situation
High interest. Inconsistent follow-through.
A loyalty program introduced a personalized early-access perk: members who selected their favorite brands would receive exclusive first access to discounted items from those brands for 14 days before the items were released to the general public. The value proposition was clear and the interest was real.
But teams were seeing signals that the experience wasn't consistently translating that interest into action — particularly across the transition between digital notifications and in-store behavior. The question wasn't whether the perk itself was valuable. It was why confident intent wasn't becoming confident action.
My role was to understand what was actually happening when members encountered this perk — where the experience was working, where it was breaking down, and what the product and design teams needed to focus on to close that gap.
Diary Study Finding
Three real behaviors. One supported path.
The diary study — which followed members in real time after they received early-access notifications — surfaced a pattern that reframed the entire problem. Members weren't behaving uniformly. There were three consistent and distinct responses to the perk, each with its own logic, its own friction points, and its own needs from the experience.
Confident — acted directly in-app
This group received the notification, felt confident in the offer, and completed the purchase without leaving the digital experience. The existing design served them well.
The path the experience was built for — but only one of three.
Interested — needed to see it in person
This group was genuinely interested but wanted to touch, try, or see the item before committing. They arrived in-store with intent — and that's where the experience either held together or fell apart.
Success depended entirely on what happened at the store.
Saw the value — planned to return later
This group recognized the perk but wasn't ready to act immediately. They assumed the 14-day window gave them time — without realizing that no reminder or re-engagement existed to bring them back before it closed.
Intent was real. The window closed before they returned.
Key Findings
Where confidence broke down
The problem wasn't the perk — it was confidence at specific moments
Members weren't disinterested. Usability testing and diary data showed consistent intent. What broke down was confidence — about timing, eligibility, what exactly the benefit was, and whether it had actually been applied. Each of these was a moment where the experience could have reinforced trust but didn't.
Store associates were an invisible loyalty touchpoint — and a critical one
In-store tag-alongs revealed that when members arrived with early-access intent, the experience they had depended almost entirely on whether the associate they encountered knew about the perk. When associates understood it, they could locate items, confirm eligibility, and reinforce the value. When they didn't, the experience felt disconnected — and doubt replaced confidence at exactly the moment purchase decisions were made.
The 14-day window created false confidence for deferred buyers
Members who planned to return "later" had genuine intent — but there was no mechanism to bring them back. No reminder, no expiry nudge, no re-engagement. Journey mapping showed the 14-day window working against this group: the window felt generous, so urgency never materialized, and the moment passed.
The app-to-store transition was the highest-risk moment in the journey
Journey mapping identified the handoff from digital notification to physical store as the point where confidence was most likely to drop. Members arrived with intent but without enough context to navigate the store experience independently — and the store wasn't always equipped to pick up where the app left off.
"Loyalty programs succeed or fail not on the strength of the perk itself, but on how confidently customers can act on it."
What Changed
From a single ideal path to a system that supports real behavior
Store associates repositioned as a core loyalty touchpoint — not a downstream detail. The research made clear that the physical store was a critical part of the loyalty experience for store-first members. Associates needed to be equipped, not assumed.
Early-access messaging redesigned for clarity — so members immediately understood what the perk was, what it covered, when it applied, and what to do next. Ambiguity at the notification stage was costing confidence before members even reached the store (a payload sketch follows this list).
Loyalty value made visible at checkout — so members could see the benefit being applied in real time, rather than wondering whether it had worked. Confirmation at the point of payment turned a question mark into a moment of reinforcement (see the checkout sketch below).
Follow-up reminders introduced within the 14-day window — to re-engage deferred buyers before the window closed. Members who planned to return needed a signal that the moment was still available, not silence until it wasn't (a reminder-cadence sketch follows).
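To ground these changes, the sketches below model how each mechanism might look in code. All three are minimal TypeScript sketches, and every type, field, and function name in them is hypothetical rather than the team's actual implementation. First, the redesigned notification, where each question members were asking becomes an explicit, required field:

```typescript
// Hypothetical payload for the redesigned early-access notification.
// Each field answers one of the questions members were asking.
interface EarlyAccessNotification {
  perkName: string;          // what the benefit is, in plain language
  brands: string[];          // which favorited brands it covers
  windowStart: Date;         // when early access opens
  windowEnd: Date;           // when the 14-day window closes
  eligibilityNote: string;   // who qualifies, stated up front
  nextAction: {
    label: string;           // a single, unambiguous call to action
    deepLink: string;        // where tapping the notification lands
  };
}
```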
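Second, the checkout confirmation. Assuming a hypothetical CartLine shape, the applied benefit is stated explicitly at the moment of payment, and its absence is equally explicit:

```typescript
// Hypothetical cart line: the early-access price is present only
// when the perk actually applied, so its absence is also visible.
interface CartLine {
  itemName: string;
  listPrice: number;
  earlyAccessPrice?: number;
}

// Turn "did it work?" into a visible confirmation at checkout.
function describeLine(line: CartLine): string {
  if (line.earlyAccessPrice === undefined) {
    return `${line.itemName}: $${line.listPrice.toFixed(2)}`;
  }
  const saved = line.listPrice - line.earlyAccessPrice;
  return (
    `${line.itemName}: $${line.earlyAccessPrice.toFixed(2)} ` +
    `(early-access price applied, you saved $${saved.toFixed(2)})`
  );
}
```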
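Finally, the re-engagement reminders. The day-7 and day-12 cadence here is illustrative only; the mechanism is simply that deferred buyers hear from the program again before the window closes:

```typescript
// Illustrative reminder cadence inside the 14-day window.
const WINDOW_DAYS = 14;
const REMINDER_OFFSETS_DAYS = [7, 12]; // assumed, not the team's actual timing

interface Reminder {
  sendAt: Date;
  message: string;
}

function scheduleReminders(windowStart: Date, brands: string[]): Reminder[] {
  const dayMs = 24 * 60 * 60 * 1000;
  return REMINDER_OFFSETS_DAYS.map((offset) => ({
    sendAt: new Date(windowStart.getTime() + offset * dayMs),
    message:
      offset <= WINDOW_DAYS / 2
        ? `Your early access to ${brands.join(", ")} is still open.`
        : `Your early access to ${brands.join(", ")} ends in ${WINDOW_DAYS - offset} days.`,
  }));
}
```

In practice, the copy and timing would be tuned against the diary-study data; the sketch only shows where the previously missing mechanism would live.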
Outcome
The research shifted the team's mental model from "why isn't the perk working?" to "where does confidence break down, and for whom?" That reframe changed which problems got prioritized — and meant that fixes were targeted at the moments that actually determined whether members followed through, rather than the perk itself.