Colette Kolenda and Kristie Savage talk about how a three-step process for mixing qualitative and quantitative methods can avoid data discrepancies and fuel product decisions.
Mixed Methods Research at Spotify
At Spotify, our approach to insights is grounded in our belief in applying multiple complementary research methodologies. Our research team, Product Insights, is made up of both User Researchers and Data Scientists.
As we can see from visualizing our methodologies in this “What-Why Framework,” User Researchers and Data Scientists are natural partners. Data Scientists look at large-scale, overarching trends in user behavior through methods such as A/B tests and statistical modeling. User Researchers apply methods such as interviews and surveys to explore the self-reported listener experience and understand listeners’ mental models and perceptions of Spotify.
Together, User Research and Data Science provide complementary perspectives that mutually enhance one another. By combining these disciplines, we gain a holistic understanding from multiple forms of data and mitigate the blind spots of any single research methodology alone.
When your data says one thing but your users say another
What happens when the different methods you use yield different, even contradictory, results? Well, this happened to our team when we were researching a test of skippable ads in Australia. In this test, Spotify Free listeners could skip the audio and video ads that come between their songs.
As a Data Scientist, Kristie ran an A/B test in which the experimental group received the skippable-ads feature and the control group had the normal Spotify ad experience. She identified three groups of users with different levels of engagement with the new feature: power skippers, medium skippers, and those who never skipped a single ad. To better understand why each group used the feature as they did, User Research followed up by interviewing listeners from each group.
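To make that segmentation concrete, here is a minimal sketch of how listeners might be bucketed by their engagement with the skip feature. The column names, the sample data, and the 75% skip-rate cutoff are all illustrative assumptions for this example, not Spotify’s actual pipeline.

```python
import pandas as pd

# Hypothetical per-listener aggregates from the experiment logs.
# Column names and values are invented for illustration.
logs = pd.DataFrame({
    "listener_id": [1, 2, 3, 4],
    "ads_received": [40, 35, 50, 20],
    "ads_skipped": [38, 12, 0, 0],
})

def label_skipper(row):
    """Bucket a listener by how heavily they used the skip feature."""
    if row["ads_skipped"] == 0:
        return "never_skipper"
    skip_rate = row["ads_skipped"] / row["ads_received"]
    # 0.75 is an assumed cutoff, not the threshold the team used.
    return "power_skipper" if skip_rate >= 0.75 else "medium_skipper"

logs["segment"] = logs.apply(label_skipper, axis=1)
print(logs[["listener_id", "segment"]])
```

Segments like these then give User Research a concrete sampling frame: a few listeners from each bucket can be recruited for follow-up interviews.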
We were surprised to find that the A/B test data said one thing, but our users in the interviews said another. Some listeners we had labeled “power skippers” from the A/B test were actually confused about which ads they could skip. These confused listeners likely wouldn’t have labeled themselves power skippers, as our data did.
Mental models are something we could only learn about from the qualitative data of User Research. What we took as the truth from the A/B test was actually an incomplete story. This difference between what our data showed and what our listeners said was hard to reconcile.
We caught the discrepancy because we were passing learnings between teammates: from Data Science, with the A/B test, to User Research, with the user interviews. This is the typical way of working together, but in reality the two disciplines weren’t truly collaborating. Although we used complementary methods on this project, they yielded contradictory insights.
We needed to take collaboration to the next level and simultaneously triangulate User Research and Data Science by pointing different methodologies at the same group of listeners, at the same time.
Three steps for mixing methods effectively: simultaneous triangulation
We outlined a process to mix methods more effectively. We call it “simultaneous triangulation,” and it generates comprehensive insights without confusing discrepancies:
Step 1: Hone your research questions.
Clearly defining the research goals makes it easier to identify opportunities to collaborate.
Step 2: Mix methods in different quadrants of the “What-Why Framework.”
Find complementary methods in different quadrants to counterbalance the strengths and weaknesses of each.
Step 3: Implement methods simultaneously to yield comprehensive insights.
Rather than deploying each method separately, design your study so that all methods point at the same group of users at the same time. This mitigates the risk of unexplainable discrepancies.
What this looks like in practice:
Step 1: Honing our research questions.
Despite some discrepancies in the first round of research, we still gained useful learnings about issues that were preventing listeners from fully adopting the feature.
With the second round of this research, we wanted to uncover all the drivers and blockers for feature usage, to understand the ‘why’ behind the ad-skip behavior. Building off the previous learnings, we crafted specific research questions for round two that would yield a deeper understanding of three focus areas: awareness, understanding, and value of skippable ads.
Step 2: Mixing methods in different quadrants of the “What-Why Framework.”
To answer these research questions holistically, we combined a diary study with data tracking. Diary studies sit in the qualitative-attitudinal quadrant of the “What-Why Framework,” while data tracking sits in the quantitative-behavioral quadrant.
These two methods provide complementary perspectives. For the diary study, we recruited participants for a three-week study in which we asked them to tell us about their daily listening experiences and their reactions to the ads. For the data tracking, we asked for each participant’s consent to look at the behavioral log data pertinent to their listening sessions. We were able to understand their overall experience through the behavioral data and their perceived experience through the diary study.
Step 3: Implementing methods simultaneously to yield comprehensive insights.
Our rich understanding of users’ experiences came from pointing both methods at the same group of listeners at the same time. While participants filled out their diary entries, we could also check the dashboard to review their behavioral data. For each diary entry, we not only got a sense of the self-reported experience (where and how they listened, and how they reacted to the ads) but could also see the behavioral side: how long they listened, how many ads they received, and which ads they skipped.
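As a rough illustration of what lining up the two sources could look like, this sketch joins hypothetical diary entries to per-day session logs for the same participants. Every table and column name here is invented for the example; it is not the team’s dashboard.

```python
import pandas as pd

# One self-reported diary entry per participant per day (invented data).
diary = pd.DataFrame({
    "participant_id": [101, 101, 102],
    "date": ["2019-03-04", "2019-03-05", "2019-03-04"],
    "self_report": ["Skipped every ad", "Ads felt fine", "Didn't notice the skips"],
})

# Aggregated behavioral logs for the same consented participants.
sessions = pd.DataFrame({
    "participant_id": [101, 101, 102],
    "date": ["2019-03-04", "2019-03-05", "2019-03-04"],
    "minutes_listened": [95, 40, 120],
    "ads_received": [12, 5, 14],
    "ads_skipped": [12, 1, 0],
})

# Line each entry up with the logged behavior for that day, so mismatches
# between what people say and what they did stand out row by row.
combined = diary.merge(sessions, on=["participant_id", "date"], how="left")
print(combined)
```

Reading the merged rows side by side is what turns a discrepancy into a concrete follow-up question for that participant.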
One method alone would have fallen short. Without the data dashboard, we would have been blind to their behavioral experience. And without the diary study, we would have had no insight into their perceived experience.
This time, discrepancies were interesting follow-ups, not dead ends
In this new study design, discrepancies between the behavioral data and the diary study were actually really useful insights! If we saw an interesting trend in a listener’s data, we could follow up with them in the diary study to learn more about the ‘why’ behind their behavior.
For example, Kristie noticed that one participant only ever skipped a maximum of six ads in a day. No matter how many he received, he would never skip more than six. This was interesting to us, because there was no limit on the number of ads a user could skip, yet his behavioral data suggested he was hitting some kind of cap. He never mentioned this in his diary entries. Again, we faced a data discrepancy!
The data said one thing: that he was experiencing a limit. The user said another: that the feature was working fine for him. Through simultaneous triangulation, we were able to follow up with him to understand his mental model, and thus the reason behind the discrepancy. He responded, “I can only skip six ads because I only get six song skips. I guessed ad skips must follow the same rule.” On Spotify Free, a user can only skip six songs per hour. He knew about this rule and misattributed it to ads as well, building his own mental model of an ad-skip limit.
This time, discrepancies weren’t a confusing dead end but interesting new opportunities to dig deeper. By simultaneously combining the behavioral data dashboard and the attitudinal diary study, we found nuanced drivers and blockers for ad skipping.
Verified insights to fuel product decisions
As a result, User Research identified distinct mental models around awareness, understanding, and value of the new feature. However, the diary study methodology isn’t equipped to quantify how many people outside the diary study would be affected by these issues.
So we collaborated to define proxies in the behavioral data that could size the impact of each insight. For example, we grouped those who were confused about an “ad skip limit” by identifying users who always plateaued at a set number of ad skips per day.
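One simple way to express that proxy in code is to flag listeners whose daily skips repeatedly stop at the same count even though more skippable ads were available. This is a minimal sketch with invented column names and an assumed minimum-days threshold, not the metric definition the team actually used.

```python
import pandas as pd

# Hypothetical daily counts per listener (invented data).
daily = pd.DataFrame({
    "listener_id": [7, 7, 7, 8, 8, 8],
    "ads_received": [10, 8, 12, 9, 11, 7],
    "ads_skipped": [6, 6, 6, 9, 11, 7],
})

def plateaus_at_cap(group, min_days=3):
    """True if a listener kept stopping at the same skip count despite
    having more skippable ads available, on at least `min_days` days."""
    capped = group[group["ads_skipped"] < group["ads_received"]]
    return len(capped) >= min_days and capped["ads_skipped"].nunique() == 1

# Listener 7 always stops at six skips, so they would be flagged;
# listener 8 skips every ad received and would not be.
suspected = daily.groupby("listener_id").filter(plateaus_at_cap)
print(suspected["listener_id"].unique())
```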
With product and design stakeholders, we brainstormed solutions to address each issue uncovered in the research. We decided to send educational messages to the groups of listeners who displayed these proxy metrics for awareness, understanding, and value issues. For listeners who were confused about an “ad skip limit,” we could send a message informing them that there was actually no limit on the number of ad skips.
These targeted educational messages reduced user confusion and, as a result, we saw our feature success metrics double. By mixing methods, we can have greater confidence in our insights and ensure product decisions are evidence-based.
Wrapping it all up
Simultaneous triangulation is an incredibly powerful tool for generating comprehensive, verified findings. If you only use one method, you can end up with blind spots. If you use methods sequentially rather than simultaneously, you can run into unexplainable contradictions… like we did at first.
The solution is simultaneous triangulation. Next time you have a complex research question, consider using the three-step process to mitigate blind spots and turn discrepancies into learning opportunities. Choose methods from different quadrants of the “What-Why Framework” to provide complementary perspectives. Apply those methods at the same time, to the same group of users.
You can apply this technique in your own insights practice: simply follow the steps with whatever research methods you have access to. You don’t need to be on the Product Insights team at Spotify, and you don’t even need both User Researchers and Data Scientists. You can do simultaneous triangulation with the methods in your toolkit.
With many research questions ahead, we’re excited to keep using simultaneous triangulation to influence complex product, design, and tech strategy decisions. We got here by experimenting with mixing different methods, and we encourage you to do the same. Tell us how this process works for you as you apply it in your research! You can reach out to us at colettek@spotify.com and ksavage@spotify.com.
—
Thank you to George Murphy, who led this research project, and to Peter Gilks for presenting this case study with us at the Qualtrics Conference earlier this year.
—
References:
[1] When to Use Which User-Experience Research Methods, by Christian Rohrer.