We broke a core metric and all we got was this lousy sense of accomplishment

Drew Dillon
AnyPerk Product & Engineering
6 min read · Aug 25, 2016

--

The AnyPerk Perks product, as you might glean from the website, is a funny beast. One part e-commerce experience, two parts benefit that HR purchases for their employees.

Like a consumer e-commerce product, we track every part of the purchase funnel. But, unlike a consumer e-commerce product, we don’t make money from actual transactions. We track purchases so we can report usage back to the wonderful HR teams that buy our product to make their employees feel happier and more appreciated.

One metric we track is clickthrough, literally how many people click a button expressing interest in a Perk. Once clicked, for the purposes of this discussion, one of two things used to happen:

  1. The user would go to a Shopify page, owned by us, for purchase and fulfillment.
  2. The user would go to the vendor site for purchase.

The differences between the two have some bearing on our metrics. In the case of #1, we understand every part of that funnel for every user. For #2, we don’t control the rest of the purchasing experience, so we work with our partners to look at conversion rates through the rest of the funnel and provide an estimate to customers that is as scientifically accurate as possible.
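
To make the vendor-site case concrete, here is a minimal sketch of that estimate in Python. The function name, click count, and 4% conversion rate are all invented for illustration; this is not our actual reporting pipeline.

    # Hypothetical sketch: estimating downstream redemptions for perks we
    # don't fulfill ourselves. The click count is a fact we measure; the
    # conversion rate is an estimate reported to us by the partner.

    def estimate_redemptions(clicks: int, partner_conversion_rate: float) -> float:
        """Estimated purchases = measured clicks * partner-reported conversion rate."""
        return clicks * partner_conversion_rate

    # Made-up example: 1,200 "Get Perk" clicks, and a partner telling us
    # roughly 4% of referred visitors end up purchasing.
    print(estimate_redemptions(1200, 0.04))  # => 48.0 estimated redemptions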

In analysis we want to make sure we put a big gold star on facts. We know, for a fact, how many people are clicking that “Get Perk” button, and can use that as a point of comparison for all perks.

Clickthrough is, for us, a “core” metric. Something we track constantly, something I report to the board. In other words, you do not break clickthrough.

So We Broke Clickthrough

Now, it just so happens, we wanted to move off of Shopify. Nothing against the good people of Shopify and their product, which has served AnyPerk well for years. Our goal was to integrate purchasing directly into the AnyPerk site in order to:

  1. Remove a step in the purchasing experience
  2. Enable purchasing with credit from our newer Rewards product

For this move, we’d chosen the open source Spree gem and have been slowly moving everything over to our own storefront. As you can probably tell from the preamble, the result was… bad.

August isn’t over yet, but estimates still put us 10% off July

WTF Just Happened?

This feature required fundamental architectural changes as well as broad content changes, which made A/B testing untenable for a company of our size. So we had to just go for it and hope for the best.

despite papa bear’s enthusiasm, not the ideal release strategy

So we didn’t have the ability to predict this and were stuck correlating after the fact.

What Do We Know?

When one of your facts is sick, get the rest of them together and make a data care package. Let’s look at what we do know.

  1. User Confirmation — nothing funny here. Invited users map to Sales activity and confirmation rates seem in line with previous months.
  2. Active Users — MAU is on a four-month tear right now. August has been no different; we’re about to beat July with a week still left in the month.
  3. Stickiness — WAU/MAU is flat to slightly up. A record-breaking run of MAUs could easily drag this ratio down (that’s just how the math works), so flat-to-up is very good. There’s a rough sketch of the math right after this list.
  4. $ Redemption — dollar redemption, for the perks/rewards we fulfill, is already higher than any previous month.
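
For anyone who wants the stickiness arithmetic spelled out, here is a rough sketch in plain Python. The users, dates, and window are all made up; it’s just the WAU/MAU ratio computed over a toy activity log, not our actual reporting code.

    from datetime import date, timedelta

    # Toy activity log: (user_id, date of activity). In reality this would
    # come from an events warehouse; these rows are purely illustrative.
    events = [
        ("alice", date(2016, 8, 1)), ("alice", date(2016, 8, 22)),
        ("bob",   date(2016, 8, 3)), ("carol", date(2016, 8, 23)),
        ("dave",  date(2016, 8, 24)),
    ]

    month_start, month_end = date(2016, 8, 1), date(2016, 8, 31)
    week_start = date(2016, 8, 18)  # a trailing seven-day window

    mau = {u for u, d in events if month_start <= d <= month_end}
    wau = {u for u, d in events if week_start <= d <= week_start + timedelta(days=6)}

    # Stickiness = WAU/MAU: the share of this month's actives who were also
    # active in the most recent week. A big influx of new MAUs can drag it down.
    stickiness = len(wau) / len(mau)
    print(len(mau), len(wau), round(stickiness, 2))  # => 4 3 0.75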

So, all-in-all, the state of the union is strong. What’s changed functionally that could have impacted clickthrough?

Hypothesis 1: Copy/Action Change

The button on most perks reads “Get Perk.”

The button on these new fulfillable perks/rewards reads “Add to cart.”

Add to cart is a heavier action, a bigger commitment. Could that be driving down clickthrough?

Maybe, but I’m not sure. It’s not necessarily that much stronger of an action, and in my previous experience the impact of small copy changes like this lands in the 2–3% range over time; a 10% swing in a single month seems a little crazy.

Hypothesis 2: Funnel Shenanigans

Even within the formerly-Shopify perks, there was a pretty wide variance in conversion from clickthrough to purchase. Focused experiential perks converted at a very high rate (ski lift tickets, movie tickets, etc.), while retail/fashion perks (watches, sunglasses, etc.) had a very low conversion rate.

A few of my theories on why this happened:

  1. Choice psychosis — the “get perk” button click historically transitioned users to Shopify. In the case of movie tickets, the number of choices didn’t change between our site and Shopify. For a given sunglass brand, the number of choices went up by a factor of 10. It’s how we’re used to browsing on sites like Amazon, but it wouldn’t surprise me if more choice actually hurts conversion.
  2. Lifestyle fit — it’s relatively easy to click a button expressing interest in a brand; who doesn’t like the idea of browsing high-end sunglasses? But it’s a different matter when you see a wall of the actual sunglasses. Then you start thinking about how they’d match your wardrobe, how often you’ll wear them, etc.
  3. Shift from low to high consideration — relatedly, that button click is free. Even with our hefty discounts, fashionable watches are far from free. The confounding variable here is lift tickets, which are still expensive, but convert highly (see #1).

All this to say, the shift from brand interest to product choice seems likely to fire users’ System 2 (the slow, deliberate mode of thinking in Kahneman’s Thinking, Fast and Slow). System 2 is finite, so your brain will protect it by saying, “Nah, I’m not gonna spend time on that right now.”

What does this have to do with the present state?

Well, if you can imagine the funnel of the old world, it probably looked like this:

(funnel diagram of the old Shopify flow; not actual conversion rates)

Post-move, our funnel looks more like this:

(funnel diagram of the new on-site flow; also not actual conversion rates)

Can you spot the difference?

Right, we removed the “get perk” button click as a part of the process of expressing interest in a brand.
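
Here’s a toy model of why that matters for the metric, in the same spirit as the “not actual conversion rates” caveat above: every number below is invented. In the old flow, the measured click was the cheap “Get Perk” click; in the new flow, the first measured click for these perks is the heavier “Add to cart,” which only happens after someone has picked an actual product.

    # Toy funnel comparison with invented rates; the point is the shape, not the numbers.
    visitors = 10000

    # Old world: cheap "Get Perk" click, then Shopify, then checkout.
    old_clicks    = visitors * 0.12    # 12% click "Get Perk" out of mild curiosity
    old_purchases = old_clicks * 0.10  # 10% of those finish the purchase on Shopify

    # New world: users browse products on our site; the first tracked click is "Add to cart".
    new_clicks    = visitors * 0.05    # only 5% commit to putting a product in a cart
    new_purchases = new_clicks * 0.30  # but far more of them go on to buy

    print(old_clicks, old_purchases)   # 1200.0 clicks, 120.0 purchases
    print(new_clicks, new_purchases)   # 500.0 clicks, 150.0 purchases
    # Measured clickthrough drops even though redemptions go up.

Blend a change like that across only the formerly-Shopify perks, and a roughly 10% dip in overall clickthrough alongside record redemption stops looking so mysterious.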

How can we feel better about this correlation?

  1. Removing funnel steps is usually a good thing and should increase overall conversion. Remember our data care package: redemption this month is at a record high.
  2. This would, I assume, be a better experience for users, and users who have a better experience are more likely to come back. That enthusiasm should eventually show up as an increase in stickiness (WAU/MAU), as noted above.

You should never feel great about correlation, but this one does tend to fit the available data and makes intuitive sense. So it will serve until we find a better theory or counterfactual data.

What Comes Next?

Good question! You’re so smart.

We trashed a core metric, but boosted user (and, by proxy, our HR customer) enthusiasm. That’s a trade I’ll take any day of the week.

I thought our models were already pretty smart, but now they’ll just have to get smarter. Game on.

You Can Help!

We have a bit of a challenge: we’re a small team, and making our models smarter is one of many things we need to tackle. If you read all the way through this, you might just be as big of a data dork as we are.

We’re hiring people like you and would love to talk to you or like-minded friends of yours.

Please share this article, those job postings, and/or apply today!
