Solved

Combining user research insights with Heap analytics

  • 4 November 2022
  • 4 replies
  • 128 views


Hey folks, 

We’re just getting Heap up and running at my current company, and as a user researcher I’d love to start using Heap to increase confidence in insights derived from small-sample qualitative research (interviews, usability tests, etc.). Does anyone have a case study or two of combining qualitative findings with analytics to bolster them, refute them, or surface additional insights?

Thanks, 

Andrew 


Best answer by Monique 8 November 2022, 01:59


4 replies


Hi Andrew,

Great post and question, and something we’ve certainly been noodling on at Heap. I’m the Customer Success Manager for SecurityScorecard, and I’d love to chat further so I can get a better sense of which parts of your environment you’ve been testing in your research and how you can hop into Heap to validate your findings. This is a great use case for both our analytics and replay capabilities.

Would you be open to a 30-minute meeting? You can use my Calendly to schedule around availability.

cheers,

Cathal


Hi Andrew,

Product designer at Heap here 👋 This is definitely something we do regularly with our own research!

One way we’ve used Heap alongside qualitative research is at the very beginning of the process, before even conducting the research. By understanding how folks are using the product, we can use Heap to build a much more targeted list of users who exhibit a specific behavior. This makes our research more targeted, often with higher response rates, and lets us dive more deeply into specific user segments.
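If it helps to see the mechanics, here’s a minimal sketch of that kind of targeting in pandas, assuming you’ve exported event-level data from Heap to a CSV. The file name, the columns (user_id, email, event), and the “created_dashboard” event are all hypothetical stand-ins — adapt them to however your own export is shaped:

```python
# Minimal sketch: building a targeted recruitment list from exported Heap
# event data. File name, columns, and the "created_dashboard" event are
# hypothetical -- adapt to your export.
import pandas as pd

events = pd.read_csv("heap_events_export.csv")

# Users who performed the behavior of interest at least three times
counts = (
    events[events["event"] == "created_dashboard"]
    .groupby(["user_id", "email"])
    .size()
    .reset_index(name="n_events")
)
recruits = counts[counts["n_events"] >= 3]
recruits.to_csv("recruitment_list.csv", index=False)
```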

How they’re connected after the research concludes varies quite a bit. For more usability-focused research, Heap makes it pretty easy to confirm how broadly something we saw applies to our whole user base. For more foundational, open-ended research, it’s definitely harder, simply because it’s less immediately connected to current user behavior. However, one thing that has been valuable for us is creating funnels or journeys that map to the product area we’re trying to learn about. Journeys especially give us a sense of paths and processes we may not even have been considering. Seeing individual session replays of users taking these journeys helps us bridge the gap between purely quant and purely qual, too!
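As a toy illustration of that “how broadly does this apply” check, under the same assumed export format — “opened_help_panel” is a made-up stand-in for whatever behavior you observed in a handful of sessions:

```python
# Toy breadth check: what share of all users exhibit the behavior we
# observed in a few usability sessions? "opened_help_panel" is an
# invented stand-in event.
import pandas as pd

events = pd.read_csv("heap_events_export.csv")  # assumed: user_id, event columns

total = events["user_id"].nunique()
affected = events.loc[events["event"] == "opened_help_panel", "user_id"].nunique()
print(f"{affected}/{total} users ({affected / total:.1%}) show the observed behavior")
```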


Hi Andrew,

I’m Research Director here at Heap. We recently used Heap analytics to validate strong hypotheses generated from our research with potential users.

In this example, we actually triangulated our data sources. We used New User NPS response comments, a week of interviews + prototype usability tests, and Session Replay analysis + Data Analytics served up from Heap.

First, we synthesized and coded the NPS verbatims by theme to form hypotheses about how new users first interacted with Heap: pain points, what worked well, gaps, and so on.
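Once the verbatims are coded, the tallying step can be as simple as this sketch — assuming a hand-coded CSV with hypothetical “score” (0–10) and “theme” columns assigned during synthesis:

```python
# Sketch: tallying hand-coded NPS verbatims by theme and score band.
import pandas as pd

nps = pd.read_csv("coded_nps_verbatims.csv")
nps["band"] = pd.cut(
    nps["score"], bins=[-1, 6, 8, 10],
    labels=["detractor", "passive", "promoter"],
)
# Theme frequency per band -- the raw material for hypotheses
print(nps.groupby(["band", "theme"], observed=True).size().unstack(fill_value=0))
```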

We then brainstormed prototype ideas to pressure-test some of these hypotheses and solve for the pain points.

Heap analytics data and Session Replay recordings helped us understand how different segments of new users behaved. (This wasn’t something we could get from NPS analysis).

We noticed one segment tended to spend their time in Dashboards & Charts made by pre-existing users, while the other key segment tended to create new things on their own. Like Bryan mentioned, funnels and journeys were the keys to understanding friction, drop-off, and variation between the two groups’ behavior. We baked potential solutions for the “ahas” our analytics unearthed into our prototypes.
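A rough sketch of that segment comparison, with hypothetical file names, columns, step events, and segment labels — note this ignores step ordering and time windows for brevity, which a real funnel (and Heap’s funnel reports) would enforce:

```python
# Sketch: comparing step-to-step reach between two behavioral segments.
import pandas as pd

events = pd.read_csv("heap_events_export.csv")   # assumed: user_id, event
segments = pd.read_csv("user_segments.csv")      # assumed: user_id, segment

steps = ["signed_up", "viewed_dashboard", "created_chart"]  # invented funnel steps
df = events.merge(segments, on="user_id")

for seg, grp in df.groupby("segment"):
    entered = grp.loc[grp["event"] == steps[0], "user_id"].nunique()
    if entered == 0:
        continue
    for step in steps[1:]:
        reached = grp.loc[grp["event"] == step, "user_id"].nunique()
        print(f"{seg}: {reached}/{entered} reached '{step}' ({reached / entered:.0%})")
```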

Heap’s behavioral data also informed our recruitment for the upcoming week of validation research: we needed representation from the two segments we’d learned about, in addition to our other criteria. We then validated our hypotheses and prototypes with a week’s worth of interviews and usability studies with potential users.

We witnessed the key behaviors from our participants and heard the expected outcomes from them as they used the prototypes. We’re currently experimenting with some of the “winning” solutions and collecting live analytics data to confirm what we saw and what participants said they’d do.

For example, we’re measuring the conversion of users from “the new page” to an action that’s one of our success metrics. We’ll keep tracking it so we can further assess adoption and impact.
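The measurement itself is simple once the events exist; here’s a sketch where “viewed_new_page” and “completed_key_action” are invented stand-ins for the real events:

```python
# Sketch: conversion from viewing a page to completing a success action.
import pandas as pd

events = pd.read_csv("heap_events_export.csv")  # assumed: user_id, event columns

viewed = set(events.loc[events["event"] == "viewed_new_page", "user_id"])
converted = viewed & set(events.loc[events["event"] == "completed_key_action", "user_id"])
if viewed:
    print(f"conversion: {len(converted)}/{len(viewed)} ({len(converted) / len(viewed):.1%})")
```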

Apologies for the long response; I hope this concrete example helps.

Monique

Thanks, Monique (and others who were kind enough to chime in); this info and the specific case study are very helpful!
