Inside UXR

22. How do I measure the ROI of Research?

Drew Freeman and Joe Marcantano Episode 22

In this episode of Inside UXR, Drew and Joe tackle one of the most challenging questions in user research: how to calculate ROI. They explore why it’s so difficult to assign a dollar value to research, whether you’re in-house or at an agency, and share strategies to demonstrate the impact of your work. From discussing metrics like NPS and churn to navigating NDAs and counterfactuals, they offer practical insights for researchers at all levels. Tune in to learn how to showcase the value of UXR—even when the numbers feel elusive!


Send your questions to InsideUXR@gmail.com

Visit us on LinkedIn, or our website, at www.insideUXR.com

Credits:
Art by Kamran Hanif
Theme music by Nearbysound
Voiceover by Anna V

22. How do I measure the ROI of Research?

Drew Freeman: Morning, Joe. How you doing?

Joe Marcantano: I am well, Drew. I'm enjoying my Sunday morning coffee. How are you?

Drew Freeman: I just finished with a nice big breakfast. You know, the kind that you only make on the weekend. So things are looking up.

Joe Marcantano: There we go. I'm presuming you're getting ready after this to watch some football.

Drew Freeman: It's football season, so yes, that's what I do.

Joe Marcantano: Awesome. Well, I understand that we have some listener submitted questions this time.

Drew Freeman: Yeah, we've got an episode that's full of listener questions, which is really exciting. So let's dive into the first one. This first one comes from senior user researcher Patricia Nil. Thank you so much for sending in this question. Patricia asks: "Longtime listener, first time caller. I was curious to know how y'all calculate the ROI, which is return on investment, of UXR, especially being at an agency. This is a topic of focus within the UX research team at my company of late, and I figured you two might have some thoughts in the area."

Joe Marcantano: This is a tough one. So maybe we should break down, for folks who are in house, why this can be so difficult for folks who work in agencies.

Drew Freeman: I mean, honestly, ROI, again, return on investment, is really just: I paid a dollar for this thing, how much benefit am I getting out of it? And ideally that should be more than a dollar. But really, I think calculating the ROI, especially for user research, is just difficult no matter whether you're in house or at an agency.
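Drew's definition, benefit relative to the dollar spent, can be sketched as a tiny calculation. This is a minimal illustration in Python; the $10,000 study cost and $35,000 benefit figures are invented for the example, not numbers from the episode.

```python
def roi(benefit, cost):
    """Return on investment as a ratio: (benefit - cost) / cost.

    A result above 0 means you got back more than you put in;
    2.5 means a 250% return on the money spent.
    """
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (benefit - cost) / cost

# Hypothetical: a study that cost $10,000 and is credited
# with $35,000 of downstream benefit.
print(f"{roi(35_000, 10_000):.0%}")  # prints "250%"
```

The hard part, as the rest of the episode explores, is not this arithmetic but defending the `benefit` number you feed into it.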

Joe Marcantano: I would push back gently on that. Granted, it depends, right? But if you are working on an existing project or an existing product and doing research on it, it can be a little easier to calculate the return, because you can look at things like conversion rates or churn or NPS and, hopefully, see an improvement after your UX research and whatever changes you deploy. You may not have access to those kinds of metrics as an agency partnering with somebody.

Drew Freeman: Oh, I'm totally in agreement with you that it can be more difficult at an agency. Absolutely. I'm just saying it's not easy as an in house researcher as well.

Joe Marcantano: No, and I think that probably one of the big reasons is we never get to investigate the counterfactual.

Drew Freeman: You know, that's exactly it.

Joe Marcantano: Yeah. Once you do the research, you can't unknow it, and you can't know what the path of no research looks like.

Drew Freeman: Well, you could, but it would be bad practice. Like it would be malpractice to do so.

Joe Marcantano: Yeah. So I mentioned NPS and churn and conversion rates. Is there anything else you can think of, Drew, that could be something you could use to measure.

Drew Freeman: The impact of research as an in house researcher. Those kinds of metrics, those KPIs, those key performance indicators, are the first things I'm looking at. So you mentioned NPS, churn, click-through rate, that kind of thing. I would also be thinking about whatever your company's KPIs are. That might be the amount purchased. That might be how many people are clicking on your call to action. That might be a lot of things. Frankly, it might even be something as simple as a SUS score, a System Usability Scale score, or it might be your task completion rates. The metric that you're looking at can vary a lot, but those metrics are the first place that I would look.

Joe Marcantano: I'm totally in agreement there. The really tricky part, I think, is when you're looking at a new product or a new feature or a new launch, and you don't have the counterfactual and you don't have a baseline. It becomes really difficult at that point to say, well, how well did the research actually work? How much did it actually improve things?

Drew Freeman: That is, yes, absolutely, much more difficult. And I know I should do this more, but I am honestly not someone who focuses

00:05:00

Drew Freeman: a ton on proving the worth of UXR in an easily calculable, showable, provable kind of way. I'm more of a "let my work speak for itself" person, and I let my stakeholders and my clients decide how much that's worth to them. Like I said, I probably should spend more time thinking about how to prove my ROI. But that's just not something that I do a ton of.

Joe Marcantano: Yeah, my natural inclination is to fall that way too. And it's definitely been an effort, you know, a real struggle to force myself to start to think that way. One of the ways that I've had a little bit of success is that when I'm coming up with the study plan, or when someone on my team is bringing me in on the study plan, we think about metrics for success, whether that be task completion, understandability, discoverability, whatever it is we're testing. And sometimes when you test and you study, things look great, and you can move forward with whatever your prototype or your feature is without any changes. And that's awesome. But there are a lot of times where you're going to recommend a change. You know, discoverability might be low, understandability of whatever feature might be low, the user flows may just not be successfully completed. And there, I think, it becomes a little easier, because then you can say: all right, in the first iteration of the product we saw a 60% completion rate, and based on the research we made some changes, and now we're up at, you know, 96% or whatever. That becomes a little easier when you're doing that iterative research to measure the actual changes.

Drew Freeman: And we should be doing iterative research anyway. Like, that's just the best way to do research and design changes. So the other thing that I think about, which is not exactly roi, but does show the worth of, of user research, is making sure that I am giving my stakeholders what they need to move forward. So is that a validation that the design changes that they've made are headed in the right direction? Is that, information on concept A versus B to help them move forward with concepts? Basically, it comes back down to in the study design phase, asking, what decisions do you need to make from this research and what information do you need to be able to make those decisions? Because that way I'm showing my worth as a researcher by helping them move forward.

Joe Marcantano: Yeah. Have you provided enough information to inform the decision so that they can make it confidently?

Drew Freeman: And like I said, that is a little bit more squishy than ROI typically is, in terms of being able to put an actual dollar amount on it. But, you know, so much about user research is a little bit squishy.

Joe Marcantano: Agreed. The thing that complicates this, and my suspicion is this is what Patricia is facing, is that when you're working in an agency world, there are two additional factors that make it incredibly difficult to measure ROI. The first is that you likely are not privy to all of that information, either because the client doesn't want to share it or doesn't think you need to know it. You may not know, for example, what the NPS was before and after; they just may decline to share that with you. That happens quite frequently. Sometimes when I'm doing research, it feels like I'm just throwing it into the void, and then six months later a product gets released publicly and I get to see how much of my recommendations and my research were actually used to inform decisions about the product.

Drew Freeman: Right.

Joe Marcantano: The other thing that makes ROI really difficult is NDAs, which are extremely common in the agency world.

Drew Freeman: I was gonna say, I don't know that I've ever worked on a project that didn't have one.

Joe Marcantano: No, me neither. And so, you know, my presumption here, and Patricia, I hope I'm not too far off, is that the reason you're wanting to calculate ROI is so that your marketing and your sales folks, and your public outreach and your social media folks, can say: we provide this return, and this is why you should hire us. That's really hard when you can't always talk about findings and outcomes.

00:10:00

Joe Marcantano: I'm staring at a pair of headphones on my desk. So let's say I did work for JBL, and I worked on their headphones, and they made a change and it resulted in a successful product launch. Well, I can't really talk about the exact things I found in the findings. I might not even be able to mention the client by name, so I might have to be vague. I might have to say something like: I worked with an audio technology company, the work that I did discovered a number of usability issues, those changes were made, and it led to a sector-leading product launch. That might be as specific as I'm able to get, depending on the specifics of your NDA.

Drew Freeman: Well, and honestly going back to the first issue, even being able to say those kinds of things is sometimes more than you're able to say as an agency researcher simply because, like you said, I don't even know what changes they made based on my research.

Joe Marcantano: Yeah, your view of the outcomes could lag six months or more behind.

Drew Freeman: And, you know, let's not even get into the projects where the end outcome or the recommendation that I made was: hey, I don't think there's a market here, or, hey, I don't think this product is going to meet the needs that you think it will. And so nothing happens. They don't release anything. So the return that I gave them was that they didn't waste time on a product that wasn't going to sell.

Joe Marcantano: I mean at that point you prevented a negative return on the money they would have dumped in.

Drew Freeman: But again, that goes back to the counterfactual of, well, I don't know how much that is. It just comes down to this: actually putting a dollar amount on it is really, really hard. Okay. So I think it's maybe worth us going back and summarizing, or just diving a little bit deeper, on four different scenarios. I'm thinking of two choices. One, are you in house or at an agency? And two, is this a pre-existing product where we've got a baseline, or a new product where there is no baseline to work off of? And how would I go about trying to prove the value of research in each case? Does that make sense?

Joe Marcantano: Yeah, it does. I think that's a good way to break it down.

Drew Freeman: Okay, so let's start in house, frankly, because I just think that's a little bit easier. In house, when there is an existing product that we're doing research on, whether it's a new update or whatever, the biggest thing that I go back to, and this comes from my time working in house at a software development company, I forget exactly what this is called, but essentially the concept is: the earlier in the development process you find a design deficiency or a bug or whatever it may be, the less expensive, in terms of money and time and resources, it is to fix. So if I find something in the design stage, before any code has been written, that's much, much less expensive to fix than if I find it in the testing phase, when code has already been written. The biggest gap is between pre-launch and post-launch. Post-launch, it is exponentially more expensive to fix something once it's already out in the real world and in the hands of users. That was always a big one for me that I went to when I was talking with higher-ups who were maybe a little bit unsure of whether user research was worth it for a particular project.
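Drew's point, that the same defect gets dramatically more expensive the later it is found, is often drawn as a "cost of change" curve. Here is a minimal Python sketch of that idea; the stage names and multiplier values are assumptions made up for illustration, not figures from the episode, and real ratios vary widely by organization.

```python
# Illustrative relative cost multipliers for fixing the same defect,
# depending on the stage at which it is discovered. The numbers are
# hypothetical; the shape (steeply increasing, with the biggest jump
# at launch) is the point.
STAGE_COST_MULTIPLIER = {
    "design": 1,         # found before any code is written
    "development": 5,    # found while code is being written
    "testing": 15,       # found in QA, after code exists
    "post_launch": 100,  # found by users in the real world
}

def fix_cost(base_cost, stage):
    """Estimated cost to fix a defect found at the given stage."""
    return base_cost * STAGE_COST_MULTIPLIER[stage]

# A flaw that costs $500 to correct on the drawing board:
print(fix_cost(500, "design"))       # prints 500
print(fix_cost(500, "post_launch"))  # prints 50000
```

Framed this way, research spend in the design stage is buying down the much larger post-launch number, which is the argument Drew says he made to unsure higher-ups.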

Joe Marcantano: And one thing to point out is that the formula you just talked through doesn't even account for users who churn because you launched something that wasn't ready, or users who became frustrated and whose view of your company or your product is now diminished.

Drew Freeman: Yeah, and it doesn't account for the time and money.

Joe Marcantano: For the developer to fix it.

Drew Freeman: And that developer time, even that is much less expensive than a user who gets so frustrated that they go to a competitor. Exactly. Or a user who before was someone who would recommend the products to friends or, you know, whatever. And now because of this deficiency, they're just a user, but they don't recommend it anymore.

Joe Marcantano: Agreed. I think that when you're doing research on an existing product and you're in house, these metrics and goals need to be thought about right at the beginning, right when you're formulating your study plan.

00:15:00

Joe Marcantano: And maybe your research questions aren't explicitly referencing it, but it should be somewhere in the background paragraph of your study plan: we are launching this product because we want to improve NPS, or whatever the KPI is you're tracking.

Drew Freeman: Or, you know, a phrase that we used a lot at my software development job was: we're making this change because we want to help the user better do X, and we're going to measure that by whatever metric. So, honestly, I think of the metrics as, yes, partially something that needs to be decided during the study design phase, but they also need to live at a level above the individual studies, almost at a program level. So in every single study we measure X, Y, and Z metrics, and we always have that baseline, that understanding of what these really core metrics are.

Joe Marcantano: So what about for a new product still in house but a new product? This to me is where it becomes a little fuzzier. How are you measuring ROI here?

Drew Freeman: So I think for me again that the concept of the earlier you find a deficiency, the cheaper it is to fix. That's really key here. But I also then think about kind of what decisions do we need to make as a product team and how can I as a researcher best give us the information to do that? So this is where I think about the return is that we were able to make a decision to go with concept A over concept B because the research told us to, you know, based on X, Y and Z findings. That's the kind of thing that I think about when there isn't a product that already exists.

Joe Marcantano: You know, we talked about how you never know the counterfactual, because it didn't happen. Right. But you can use your research as a proxy for that. You can say, well, we did the A/B testing, and we found that the B version that we didn't go with was only successful 60% of the time, and the A version was successful 90% of the time, and that's why we went with A. You could, I think, make a pretty good argument for: we prevented a 30% lower return on this product, or, our research informed decisions that led to a 90-plus percent retention versus the 60% the other model showed. You can use your research as a window into the counterfactual. It's not going to be an exact representation, because you're using a smaller sample size, but I think it gives you a good starting point for forming the argument for the return on investment of the research.
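Joe's 60%-versus-90% comparison can be written out as a tiny calculation, treating the rejected concept's test results as a stand-in for the counterfactual. The function name and figures below are ours, not the show's, and as Joe notes, small research samples make this an estimate rather than an exact number.

```python
def counterfactual_uplift(rate_chosen, rate_rejected):
    """Difference in observed success rate between the concept the team
    shipped and the one research steered it away from.

    This is a proxy for the unknowable no-research path: the rejected
    concept's test performance stands in for what might have launched.
    """
    return rate_chosen - rate_rejected

# Concept A succeeded 90% of the time in testing, concept B only 60%:
uplift = counterfactual_uplift(0.90, 0.60)
print(f"Research is credited with avoiding roughly a {uplift:.0%} "
      f"lower success rate")
```

The honest framing is the one Joe gives: a starting point for the ROI argument, not a measured outcome, since the rejected version was never exposed to real users at scale.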

Drew Freeman: I agree. And a lot of what we've been talking about is predicated on doing primarily qualitative research. If you're doing quantitative research, some of this gets a little bit easier. In fact, there are quantitative methods where you're basically just measuring: how big would our market share be given products with the X, Y, and Z features that we are planning? That is basically just measuring ROI, because you can say, all right, for this feature set, this would be our market share; for that feature set, that would be our market share. There you go, that difference. That's your ROI for the research.

Joe Marcantano: Yeah. The qualitative side is always a little bit fuzzier and takes a little bit more of, you know, putting on the Sherlock Holmes cap and finding the specific nuances. And that extends to the value of the research, I think.

Drew Freeman: But I think your biggest advantage when you're in house and trying to come up with this is that you have really good access to, and hopefully good relationships with, your higher-up stakeholders. So you can go to them and have really honest conversations: how can I show value to you, how can I prove my worth? And let them tell you what they need to see.

Joe Marcantano: And you can even go back to earlier versions and you can say, hey, this was the first version and this was the third version and the UX research informed this change, this change and this change. And then maybe post launch, once you get a good idea of which features are most popular, which things people most like, you can start making some educated guesses and saying people really

00:20:00

Joe Marcantano: love the blue version of this product. And it was because of the UX research that we released a version in blue at all and it's a great seller now. So that entire line is thanks to the UX research.

Drew Freeman: Agreed. Okay. I think that's a pretty good place to leave it for in house. Anything else that you want to add for in house?

Joe Marcantano: No, I agree.

Drew Freeman: Okay, so then let's move to agency, which I think we both agree is a little bit more challenging.

Joe Marcantano: It is. I think a big chunk of this just depends on your relationship with your stakeholders, or your clients. You know, if you have an ongoing relationship, they're probably going to share a little more with you. You've built a little trust, you've built some relationship with them. But if this is a new client, you're probably going to get the bare-bones information you need to run the study.

Drew Freeman: Even in the case when it's a new client, I think it's still 100% appropriate and 100% responsible to ask: what do you need from me to make this research project a success, and valuable? And then let that be your guiding star. It's not going to make all the decisions for you, of course, but it really does give you the best window into how you can provide the best return for this client.

Joe Marcantano: Yeah, I think with clients it's going to be a lot harder to provide a dollar amount, to say that the research they paid X dollars for resulted in a 5x return, whatever the numbers are. You're probably not going to get that information. This is going to be much more of the qualitative: I provided what they needed to make the decisions they needed to make, and that led to the launch of X or Y or whatever.

Drew Freeman: Yeah, agreed. So I think it's easier for us to talk about in terms of what you, what you should be looking at and thinking about, but harder to do in practice.

Joe Marcantano: The only other part to that is, you know, when you have public-facing products, you can't violate your NDA. You can't say: we did the work that discovered this. But you can certainly say, if your client is okay with disclosing that you worked with them: hey, this was a version of their product before our research, and this was a version after. I can't talk about the specifics of what the finding was, but you and I as responsible adults can sit here and look and go, yeah, the after is better. You're not going to get that strict numerical return on investment there; it falls into the fuzzier category. But I think that's definitely something you can do if the client is okay with disclosing that you worked with them.

Drew Freeman: Mhm, yeah, I agree. Again, thinking about it as an agency researcher who, full transparency, is not super involved in the sales and marketing side of things, I'm focusing on what decisions my research allowed my client to make. And like Joe has been talking about with NDAs, I almost never am able to say I helped them make decision A, you know, the exact decision. It's more along the lines of: I was able to help them make decisions on which concepts to move forward with, or help them make decisions about which features to prioritize, that sort of thing.

Joe Marcantano: Drew, do you think that there's a huge difference in measuring the ROI when you're at an agency working on a new product versus an existing product?

Drew Freeman: Most of the time, no. I think the exception to that is if you have a very deep, long standing relationship with a client and you're almost pseudo in house at that point. So then you can kind of get closer to a lot of the things that we were discussing in the in house section. What are your thoughts?

Joe Marcantano: Yeah, I think that when you're working on an existing product, especially if it's public-facing, you can get a little bit of that before and after. But largely I agree; it's incredibly difficult. Even the best clients that you and I work with are unlikely to share specific sales figures or NPS scores. You know, we might get told, hey, churn is higher here than we want it to be, but I don't think we're going to get those specific numbers all the time, if ever.

Drew Freeman: I have gotten those numbers, but often those are, you know, explicitly included in an NDA, so I couldn't use them in sharing the ROI anyway.

Joe Marcantano: Yeah. Anything else you want to cover on this question?

Drew Freeman: I think we've covered most

00:25:00

Drew Freeman: of it. For me, and I've said this multiple times throughout this episode already, the most important piece here is coming back to your stakeholders and having an honest conversation: what do you need to see in order to make this research project worthwhile and valuable? What information do I need to give you to help prove the worth of research? Because at the end of the day, stakeholders want the work that we're doing to be valuable. They're going to tell you: this is what I need to see, this is what I want to accomplish. And then it's our job as researchers to try to accomplish those things while also maintaining good research practices.

Joe Marcantano: Depending on how your agency is shaped, you know, you might be an agency that's paired design and research. You might be an agency that just does research. Or if you're in house, I don't think it's the worst idea in the world to pair your return with design. because research and design are kind of the left and right hand. They really do work together. They make each other better. And so you may have some better luck measuring the ROI of UX in general rather than UXR specifically.

Drew Freeman: The idea being that you can say the return on investment for these design changes was 5x and research helped identify those design changes as valuable. It doesn't exactly say that, okay, research was 2x of the, you know, was 2x and design was 3x. But it does help you tie yourself to whatever that ROI is.

Joe Marcantano: Yeah. The only caveat I would throw out with that is: be careful, and know your leadership there. If you work someplace where the view of research is that it's the little brother of design, and it's viewed as expendable or optional, then I would work hard to demonstrate your value outside of design. But if you work in a more UX-mature place that values design and research equally, then I think you're safer to do that.

Drew Freeman: I think that's a pretty good place to leave this conversation. Do you have anything else that you want to finish with Joe?

Joe Marcantano: No, that covers everything. I was hoping to hit on this one.

Drew Freeman: All right, well, thank you so much for the question, Patricia. Obviously, this is something that you and your company are thinking about a lot, and it's a good thing to be thinking about. And as Joe and I just elucidated, there are no right and wrong answers here. So you've got to figure it out and go forward with what you think is going to work best for you. For everyone else listening today, thank you for listening, and please give us a like and subscribe on your podcast platform of choice. And if you've got questions like Patricia did, please send them our way. You can do that by emailing us at insideuxr@gmail.com. If you'd like to support the show, there's a link in the show notes where you can do that. And with that, I'm Drew Freeman.

Joe Marcantano: And I'm Joe Marcantano, and we'll see you next time.

00:28:22
