
Inside UXR
Explore the practice of user experience research with Drew and Joe, one question at a time.
Send your questions and feedback to insideUXR@gmail.com
21. How do I manage my personal bias in my study?
In this episode of Inside UXR, Joe and Drew tackle the challenging topic of managing personal bias in user research. They dive into practical strategies for identifying and minimizing bias, even when working on products they use themselves. With relatable insights and real-world examples, they explore how to develop testable hypotheses and embrace unexpected findings. Perfect for researchers looking to level up their objectivity and refine their approach!
Send your questions to InsideUXR@gmail.com
Visit us on LinkedIn, or our website, at www.insideUXR.com
Credits:
Art by Kamran Hanif
Theme music by Nearbysound
Voiceover by Anna V
21. How do I manage my personal bias in my study?
Joe Marcantano: Drew, good morning.
Drew Freeman: Morning again, Joe.
Joe Marcantano: We are recording our second episode of the day, and I am a little excited about this question. This is something that I have dealt with personally as I've matured as a researcher, and I'm sure you have as well.
Drew Freeman: Only like every single project.
Joe Marcantano: Awesome. So I'm going to dive right in here. This question was submitted anonymously, and the listener writes: how do you remove your personal biases about a design when running sessions? Sometimes during kickoffs, when stakeholders are showing me their intended flow, I can spot the red flags, and I get stuck on them because I assume that participants will spot them as well. I personally have a design background, so I'm prone to nitpicking. But what can I do to be more open-minded about this?
Drew Freeman: I love this question so much, partially because you, as the anonymous question asker, are recognizing something you might be doing that is introducing bias into your research. But I also love this question because you're recognizing things that might be problems. And that's great for your research and for your study. The key is just how to find that happy middle ground, which Joe and I can talk about.
Joe Marcantano: There was a part of me that said this is going to be the shortest episode ever. The answer is: you know about your biases, you're all set. Just make sure you don't fall for them and you're good to go. You know, there are so many researchers out there who don't even realize that they have these biases, or who can't set aside their own personal use cases. They cannot separate themselves from the user, and their research ends up, for lack of a better word, tainted.
Drew Freeman: Agreed. 100% agreed. Okay, so where do you want to start with this? Because I have so many thoughts and I don't know the best way to dive in.
Joe Marcantano: I do like that this person called out that they have a design background. But I think we should start broader, setting the design background aside and just talking about picking up on bad UX design or bad design practices generally.
Drew Freeman: I was going to call that out too, because I don't have any sort of design background, but simply by doing UX and being a person who thinks about UX and has learned a little bit about what good design looks like just kind of by osmosis, I can pick up on these things too.
Joe Marcantano: Yeah, but I think we should start even broader. Think of a product as ubiquitous as email. Everybody uses email, right? If you are working on an email product, it can be difficult to think about new flows, new features, something as simple as how people keep their inboxes organized. If you are, say, an inbox zero person, but you are testing a product or a feature that's geared towards helping people keep their inbox more organized, or mass archive, or mass delete, or whatever, it can be really difficult to look at the product and go, but why would I even need this to exist? My inbox is always at zero or one or two.
Drew Freeman: That is where you as the researcher need to remember a universal truth, which is that you are not everyone, and lots of people think differently than you. If you remember nothing else but that, you will go a long way towards minimizing your own biases as a potential user.
Joe Marcantano: Agreed. And you know, I have, with some degree of frequency, worked on products that I use in my personal life, and I've had to check those preconceived "this is how I do it, this is how I would do it" notions by running my study plan past a colleague, running my flow past somebody else who maybe uses the product differently, and just saying: hey, does all this flow in an intuitive way? Can you check that I'm not using any leading questions here? Help me ensure that the way
00:05:00
Joe Marcantano: I'm conducting this research is going to be as unbiased as possible.
Drew Freeman: I am a huge believer in having someone else review my work. No matter how experienced I am, no matter how confident I am, it is always valuable to have someone else review your work.
Joe Marcantano: Yeah. I can't think of a scenario where an extra set of eyes is a bad thing.
Drew Freeman: Mm. So I do want to go back to the first line in our listener's question, which is: how do you remove your personal biases about a design? I don't want to push back on it, but I do want to put the idea in your brain that it's okay to have personal biases about a design. It's okay to have the thought of, wow, this is going to be something that users harp on and that users hate; wow, this is bad design practice. That's okay. You know what those become? Those become hypotheses that we can then test as part of the user research. It becomes: my hypothesis is that users are going to struggle to find X button. Let me write a task that puts them in the situation where they need to find X button and see if I'm right. Because a lot of the time, at least in my experience, with the hypotheses I've come in feeling very strongly about, the most fun is when participants tell me that I am 180 degrees wrong and that my hypothesis is completely off.
Joe Marcantano: I was just thinking something along the same lines. You know, if you ever hear scientists talk about when they do testing...
Drew Freeman: I was just gonna bring up the scientific method. Scientists don't do science and don't do research thinking they have no idea what the results are gonna be. They do it thinking, this is what's going to happen.
Joe Marcantano: But one of the things that I want to talk about is, when they have a hypothesis and it's wrong, most scientists get excited.
Drew Freeman: Exactly.
Joe Marcantano: That's a new finding. And that can be a little counterintuitive. Frankly, I was a little skeptical of that mindset as well, until the first few times I developed a hypothesis while looking at a model or a mock and it turned out to be wrong. I found myself excited. I found myself saying: I've discovered something new. This was not what we thought. This is cool. And that's one of the greatest feelings.
Drew Freeman: I just had an example of this in the last research project that I worked on, where we asked a question like: hey, we're thinking about multiple different names for this thing. How important is the name going to be for you? I was thinking, people are not going to care what the name of this feature is. Whatever. They don't care what it's called; they care what it does. Turns out I was 100% wrong, and people felt very strongly; they expected the name to matter a great deal to them. That's cool. That means: oh, wow, okay. This door that I was completely shutting? Nope. Turns out I need to open it back up, and there's a whole other room to investigate.
Joe Marcantano: And I think this is a good point to reiterate something you already hammered on. When you have your hypothesis and you're testing it, or you're designing the test for it, it is an incredibly wise idea to run that by another person, and even tell that person: hey, Drew, take a look at this. My hypothesis is this. Let them see behind the curtain a little bit and ask: do you think that I'm leading too much here? Am I giving away too much? Are you able to figure out what the hypothesis is, or specifically what I'm doing, based on how I'm testing this?
Drew Freeman: Listeners can't see me, but I'm nodding very strongly. Yeah, like Joe said at the very beginning, anonymous question asker, you've already done kind of the hardest part here, which is identifying that you have potential bias. That is the hardest part, the one a lot of people can't get past. Once you've identified something as a potential bias, it becomes much easier to do the things that you need to do to minimize that bias. And that's another thing to remember: we are never going to 100% remove or eliminate all bias. It's just not possible. We're human. Bias is kind of what humans do, consciously or unconsciously. The key is to minimize it as much as possible. Okay, so in practice, what does that look like? For me, reading through your question: when you're going through the kickoffs and stakeholders are showing you the intended path, they're showing you what they expect
00:10:00
Drew Freeman: users to do in their product, and you're spotting red flags. What I do is make a note of that, whether it's a mental note or an actual handwritten note: ooh, I think this is going to be a problem. I don't think users are going to be able to complete this task, that sort of thing. And then those become my hypotheses, which I then test like a scientist, using the scientific method to try to prove or disprove them.
Joe Marcantano: Drew, what do you think, because I've kind of waffled back and forth as I've been thinking about this, about specifically calling those things out to the stakeholders during the kickoff? And I don't mean in a confrontational way, saying, hey, this user flow sucks. I mean saying something like: hey, have you thought about people potentially getting stuck here? Or have you thought about the risk of this? What are your thoughts on bringing that up in a kickoff as you're being shown the model or the mock?
Drew Freeman: I might not do it in the kickoff, but I would certainly do it as I'm working on the discussion guide and going back and forth with my stakeholders. And you're right, you don't say, hey, this product you've designed here is not going to work. But you can ask things like: is there anything that you're concerned about in this design? Are there hypotheses that I should know about going into it? Because those hypotheses are then something that I need to make sure I'm addressing when I'm doing analysis and reporting. What risks are you most concerned about? That kind of phrasing you just used, Joe, is exactly what I would recommend asking.
Joe Marcantano: One of the things I was thinking about is, you know, how often our own assumptions are wrong.
Drew Freeman: All the time.
Joe Marcantano: All the time. And so maybe when I'm shown something, I have an assumption that it is bad, or ill-informed, or a guess. But there could be other research I don't know about. There could be previous experience that the designer or the product manager has that I'm not privy to. And so maybe what to me looks like a guess or an ill-informed hunch is actually something that's well thought out, researched, and maybe learned through experience.
Drew Freeman: The hard way, 100%. I think for me, the biggest thing comes back to: I am but one person, and my experience is not everyone's experience. And maybe this is just telling about how my brain works, but I tell myself, no, I'm a little bit weird. I'm not average, I'm not standard, I'm not typical. So maybe, because of the way my brain is wired, it's just a little bit easier for me to remember that my experience is not going to be the same as everyone else's.
Joe Marcantano: You and I have talked about how neither of us really has a strong design background, other than what we've learned through osmosis, working so closely with design. But I think a related instance is working on products that we use in our personal lives. There have been plenty of times where I have had a strongly held belief about a product that I use, and I'm testing a feature or an addition or a change that will drastically change how I use that product. It is extremely difficult at that point not to say, I want to prove this wrong, because I want to keep the way I use it viable. I want to continue to be able to do X with the product. It can be really, really tough then to say: all right, how do I conduct unbiased research here?
Drew Freeman: And the answer, which you mentioned, is really difficult: remembering that this product is not for me. This product is for lots of people, some of whom might be like me, but many of whom might not be. And simply remembering that will go a long way in helping you minimize that instinct and that urge.
Joe Marcantano: I think one thing that's helped me is just getting a little experience working with products that I use. The first times are really difficult, because it challenges the way I view the product. It challenges my mental model. But as you continue to do it, what I've discovered is that my view changes. It can be informed. You know,
00:15:00
Joe Marcantano: one of the coolest things in the world is when I see somebody doing something with a product that I use that I hadn't done before, and I go, whoa, that's really cool. I want to try that. And just the fact that I'm researching a product I use changes the way I use the product. And when that happens more than once, you start to naturally have this more open-minded approach: okay, I'm going to test this. I know that even though I am a user in this case, I'm not the user. And I might get to pick up something cool that I hadn't seen before, or see the product used in a way that I hadn't thought of before.
Drew Freeman: I love that way of thinking. I love that idea. And maybe this is just kind of inherent to me, but especially in a professional sense, when I have what I think is a really cool idea, once I've fleshed it out to a degree that I'm happy with, the first thing I do is go to colleagues that I really trust and think highly of and ask them: tell me why this won't work. Rip this apart. Tell me what I'm missing. What is stupid about this? That's the first thing I do when I've got an idea that I'm excited about.
Joe Marcantano: Same. Yeah. I am always taking my ideas and saying, hey, poke holes in this for me. Find out how this doesn't work.
Drew Freeman: Okay, so I'm going to take us on a little bit of a tangent here. You, anonymous question asker, are going through a kickoff and you've identified something that is bad design practice. I think it is okay to have the conversation in your own head: is there a design best practice that this is breaking? Is there a heuristic that this is breaking? And if there is, I think it's okay to say, hey, I'm concerned that this is not following X, Y, and Z heuristic. Like, I'm concerned that this is not meeting the flexibility and efficiency of use heuristic, or that there's no good way to recover from an error here. And simply ask your stakeholder: what conversations have you had around this area? Is that something that you've thought about? Because I know of situations where a researcher has done that, and even before the research went live to participants, the designers said, oh, you're right, that is 100% a problem, let's make a change. Because I don't want to be getting feedback on just that; we already know that's a problem with an n of 1. Why do research on that? I've also seen the same thing happen with dry run participants, essentially an n of 0, a P0 participant, where the team goes: oh, we didn't think of that. That's a problem. Let's press pause on actually going to our participants and make a quick design change.
Joe Marcantano: There's a difference between something that is, using air quotes, objectively wrong with the design versus subjectively wrong. What I mean by that: the easiest example that comes to mind is accessibility and color contrast, right? You can run that through a color contrast calculator, and there are tons of them online, and say: objectively, these two colors are too close together; people will get confused. But then there are things that are more subjectively wrong. Is this button as discoverable as we think it is? Is this call to action as clear as we think it is? Those are the things that you need participants for. But if you're discovering, as stakeholders walk you through the design, that the color contrast is not right, or the font is too small, or whatever, something that's just objectively wrong, I think those are the things that you can bring up before even P1 and say: hey, have you run this through a color contrast calculator? Because I'm seeing a couple of colors that might look a little too close together as far as the shades go.
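For anyone curious what those color contrast calculators actually compute, here's a minimal sketch of the WCAG 2.x contrast ratio check in Python. The helper names and the gray-on-white example are our own illustration, not taken from any particular tool:

```python
# Minimal sketch of the WCAG 2.x color contrast check (the math behind
# most online contrast calculators). Illustrative only.

def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light, per WCAG 2.x."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """Relative luminance of an (R, G, B) color, 0.0 (black) to 1.0 (white)."""
    r, g, b = (_linearize(ch) for ch in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a: tuple, color_b: tuple) -> float:
    """Contrast ratio (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1."""
    la, lb = relative_luminance(color_a), relative_luminance(color_b)
    lighter, darker = max(la, lb), min(la, lb)
    return (lighter + 0.05) / (darker + 0.05)

# WCAG AA requires at least 4.5:1 for normal-size text (3:1 for large text).
ratio = contrast_ratio((119, 119, 119), (255, 255, 255))  # #777777 on white
print(f"{ratio:.2f}:1 -> {'passes' if ratio >= 4.5 else 'fails'} AA")
```

Running it on #777777 text over a white background gives roughly 4.48:1, which narrowly fails the 4.5:1 AA threshold for normal text; exactly the kind of objectively checkable issue Joe is describing.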
Drew Freeman: And I think even in those cases where it's more of a subjective thing, it's okay to bring it up and say, hey, I have potential concerns about this. Or even better: has anyone brought this potential concern up before me? And the reason that I love asking that question is that I want to know if I'm pushing on a closed or an open door when it comes to analysis and the readout. Like,
00:20:00
Drew Freeman: I want to know how much proof I need to come armed with. If this is a conversation where people are like, yeah, we think that this is going to be a problem, but we need participants to tell us, okay, that's one level of proof. That's different than: oh no, we've had this conversation, and people are dead set on doing it this way, and they think they're right. Now that is a higher level of proof that I need to bring if my hypothesis is going to be proven correct.
Joe Marcantano: So I think the high-level wrap-up here is, as we talked about in the beginning, just knowing that you're doing this...
Drew Freeman: Is more than half the battle.
Joe Marcantano: Yeah, for sure. Calling things out in the kickoff, or immediately after, in a non-confrontational way.
Drew Freeman: This is where your soft skills and communication skills are going to be valuable.
Joe Marcantano: Yeah, totally acceptable. But the other thing is that these assumptions that you have can form your hypotheses and can inform how you test. And as long as you are flexible enough and open-minded enough to remember that you're an n of 1 here, it may be that you think X is going to be a problem, and it turns out Y is the problem, and X never even enters into people's calculus. Just be flexible enough and adaptable enough to abandon your hypothesis if it's immediately proved wrong, or if something else proves to be a more important factor.
Drew Freeman: Yeah, I think the thing that I would say is: use those hypotheses to help drive the testing and the research that you're doing. Make sure that you are, like Joe said, keeping an open mind and not leading people to agree with your hypothesis. The best way to do that is to have other researchers review your work.
Joe Marcantano: Awesome. I think we've covered this one pretty well here, Drew.
Drew Freeman: Yeah, this was a fun one.
Joe Marcantano: Yeah, thanks, Anonymous, for sending that in. This was a great question, and thank you to all our listeners for checking out this episode. If you enjoyed this episode, please give us a like and subscribe wherever you get your podcasts. If you've got a question you want to hear us talk about, send it to insideUXR@gmail.com. There's a link in the show notes if you'd like to support the show. And with that, I'm Joe Marcantano.
Drew Freeman: I'm Drew Freeman, and we'll see you next time.
00:22:30