#103 - Ethical Design and Respectful UX Research with Kat Zhou of Epidemic Sound

[00:00:00] Kat Zhou: I think that when you bring in that holistic approach from regulation, from us as individual actors, but also community movements, that can be very powerful. That's why I'm optimistic. It's hard to be optimistic, but I'm slightly optimistic because the stuff is happening.
[00:00:22] Erin May: This is Erin May.
[00:00:24] John-Henry Forster: I'm John Henry Forster and this is Awkward Silences.
[00:00:30] Erin: Silences. Hello, everybody. Welcome back to Awkward Silences. Today, we're here with Kat Zhou, the creator of Design Ethically, and we're going to be talking about ethical design and ethical research, very apropos and on brand. Excited to jump in. Hot topic, important topic. Thanks for joining us.
[00:00:56] Kat: Yes. Thank you for having me. So nice to meet you two virtually.
[00:01:00] Erin: Yes. Still the dominant way, I suppose, these days, but we're getting more real-life time in. I've got JH here too.
[00:01:08] JH: Yes, this is a cool topic. Feels like a big one. My silly brain goes to Billy Madison when they're like, "Business ethics." Design ethics.
[00:01:15] Erin: Was that guy Bradley whatever?
[00:01:18] JH: Yes, the last name-
[00:01:19] Erin: Whitford. Bradley Whitford. Yes, that's a good one.
[00:01:20] JH: Yes, I'm excited to unpack it. I think we haven't come at it from this angle before, so I think it'll be a fun conversation.
[00:01:26] Erin: Yes. On that note, what are we talking about when we talk about ethical design? Because we could talk about a lot of things and I think we will talk about several things, but what do you think about? What are you thinking about every day in your work when you think about ethical design?
[00:01:40] Kat: It is such a loaded term, and it covers a lot. I think it covers everything from designing genuinely helpful, empowering, accessible products for people, to designing products that do not harm, manipulate, or deceive folks. It also means designing things that don't exacerbate systemic inequalities, because we all exist in a system that has a lot of unfair power differentials going on.
I think it also means that when we're looking at the practices we're employing to design and research, we're doing these things with the utmost consideration and respect for people, especially marginalized communities. It also means designing in teams that are inclusive and safe, where you feel included and safe no matter what your background is, and that are actually willing to advocate for doing the right thing over the bottom line, which is oftentimes a lot harder than it might seem.
Yes, and fundamentally, when we talk about research too, because this is your bread and butter, it means doing it in a way that's not exploitative, a way that genuinely honors and respects the people you're researching.
[00:03:03] JH: Did you have any thoughts on-- because I think we try to get this to a spot that's pretty actionable or practical advice for people. If I'm listening to this as a researcher, designer, or product manager, and I'm like, "We should be more ethical in how we do our design work where I'm at," where do you even start? Is it that you should just read about this stuff? Do you just start being more inclusive in whom you talk to in your research?
What's a good foot in the door to start being a little bit more ethical in your approach to design?
[00:03:28] Kat: Oh, yes. I love that you mentioned reading. I think that is such a fantastic way to learn more about the world. There's so much amazing scholarship that's not necessarily design or research scholarship, but rather, literature that falls under critical race theory or disability studies, or gender studies. There's so much amazing work that's been done in those realms where you get to learn about the world from a very different angle maybe from your own.
That is such an eye-opening thing for a lot of folks in our industry: to read that work, to engage with those difficult topics, and to understand that when we're designing in a traditional tech company, we're not necessarily representative of the world, but often we're designing for a lot of people all around the world. Being mindful of those things, expanding your horizons in that regard, and learning more about how your product can actually impact people from all over is really important.
I think beyond that, as you're doing your research, just be mindful of how, when you're meeting people and talking to them, whether it's virtually or in real life, you approach those interactions as a researcher or designer. How do you have that respect? That's what I always go back to. It's just that respect, because we do interact with so many different people from all walks of life.
[00:05:00] JH: Yes. Do you have any favorite resources while we're talking about books and all that sort of stuff that you recommend or give to people?
[00:05:07] Kat: I have a whole list. On designethically.com, under the toolkit section, there's a list of books and they're from all different scholars. I love the works that Ruha Benjamin has written, Race After Technology. There's also Cedric Robinson who's done a lot of amazing work on like the intersections between racial inequality and also socioeconomic inequality.
Then there's Algorithms of Oppression written by Dr. Safiya Noble, which is really fascinating stuff that makes you really consider the industry that we're in and the workings of it and how we might be able to improve it.
[00:05:45] Erin: That's great. We'll link that and link some of those resources in the writeup as well. I want to ask maybe a dumb question, but you talked about just treating people with respect, which sounds like a really basic thing. What do you mean by that?
In research, you hear about empathy, right? Empathy has maybe been talked about so much that now there's the blowback to the blowback to empathy, and can you ever really be empathetic, and whatever. I like this idea of respect. That seems like something we could actually try to achieve. What do you think about when you think about trying to treat people with respect?
[00:06:21] Kat: Yes. I think one thing is to think about where you stand in society as a designer, coming from perhaps an agency or a big company, and who you're designing for and who you might be researching. A lot of times, I think it's very easy for designers, or anyone in a tech company, especially if they're expanding into an emerging market, a so-called emerging market, to think, "Oh, we're going to come in and we're going to fix a problem.
We're going to fix this thing that people are not doing right or people are struggling with." Perhaps they're not actually struggling with that, but we pinpoint this thing that we think they could be doing better. They could be living more productively, whatever that means. They could be doing things faster. We come in and we think we're going to help them. It has this patronizing attitude, I think, where we come in and we're like, "We're going to help these people because we know better."
Of course, we'll do our research, but we know better. That's the underlying mentality. It isn't necessarily stated, but I think it's very much implied by the way that we work. That's something we have to be mindful of, and that's where the respect comes in. How do we honor these other ways of existing, of being, these other epistemologies, and not come in with this savior-complex persona?
That ties into a lot of the work we see on decolonizing design and decolonizing research: realizing that there are all these nuances. If you're coming into a place and doing research for the supposed benefit of the people you're working with, you sometimes have to question that and really think, "Is this benefiting them, or is this benefiting my paycheck or our bottom line? What are we telling ourselves, and how are we approaching them?" That's what I'm referring to when I say respect.
[00:08:25] JH: Nice. I love that framing. The way you talked about it there, like the bottom line and the business part, it feels like maybe when you're doing this stuff and this work in a business context, it almost becomes harder to be respectful or to treat people the way you maybe would in a different context. Do you have any thoughts on how you balance those things of, "I have this business goal or this objective that I'm responsible for, but I also want to be human and respect this person"?
Sometimes those things can maybe feel a little in conflict. I'm wondering if there's ways to avoid that.
[00:08:55] Kat: Yes. It's really hard. I think we've traditionally talked about it as a balancing act, but sometimes I think there's no way to balance. Naomi Klein, who's written a lot about the climate crisis, makes a point in her book-- the title escapes me, I think it's This Changes Everything or something. We'll fact-check that later.
She makes the point that sometimes, as with the environmental crisis, there is no way to reconcile the business model we have in this capitalist system with the need for the earth to heal and replenish itself. In some ways, that can be the case in tech, right? You cannot design products that are addictive and still strike a balance. I mean, you can make a business case for it, but there's no way to justify it. You're either designing addictive stuff or you're not.
In that case, what can we do? We can do what we can. We can start small. We can work larger. Just do something as opposed to nothing. What I tell people is to find an ally or an accomplice in the company and talk to them. You might be surprised that people have the same feeling you do about something that can be problematic. Speak up in your product team when you have the chance to talk in these meetings. Speak up in town halls. That's a great way to communicate if something's not right. Then also abandon the urge to accept, "Oh, but it's always been done this way." So what? Do it in a new way.
It's not easy. I think the biggest thing is that it's really difficult, especially if you are a newer designer or researcher. It can be a burnout situation. Sometimes, if you do feel burnt out, that might be the sign to find a new place to dedicate your talents. But it's tricky.
[00:10:53] Erin: I love what you're saying, though. I do think there's this tendency in tech that's like, "We want to have our cake and eat it too. How can we turn this into a win-win? How do we do ethical design and make the user happy and our profits are not going to suffer, and everything's going to be in perfect harmony?" That's not always the case. To your point, I think a little courage goes a long way.
The people who need to exert that courage are leaders, the people who can afford to. It's not fair to put that on the shoulders of people just starting out, who have to prove themselves and maybe have to fall in line a little bit more than others do.
[00:11:30] Kat: Yes. I like that you brought that up, because I feel like a lot of people, myself included, who have been working at this intersection of tech and design and ethics have spent so much time bending over backwards, trying to figure out how to reconcile business with ethics, or idealism with pragmatism.
It's not necessarily possible. We've seen all this corporate social responsibility stuff and ethics washing, a lot of which is performative, and behind the scenes they're actually doing things that run counter to what they're saying. Some things just can't be justified and need to be regulated out, and that's the fact of the matter.
[00:12:09] Erin: Yes. In the long term, I do think culture catches up a little bit, and we'll see if business and capitalism catch up too, but sometimes the thought leadership of doing the right thing is a little bit ahead of the times too.
[00:12:22] JH: I'd love to unpack the pragmatism versus idealism thing a little bit more. Whenever you get into a moral topic like ethics, it can come with a lot of idealism, but you were also describing things you can do that are better. Maybe they don't change the whole paradigm, but they move something forward in your work, and you, as a team, strive to be a little bit better than you otherwise would be. That still is meaningful.
You mentioned climate change earlier. Of course, regulating the people dumping chemicals in the rivers or polluting the skies is going to have a bigger impact than anything I can do as an individual, but that doesn't make the individual actions I take meaningless either, like composting or doing other things. They're still good things to do. How do you try to square that for teams?
It's worth doing what you can on a local level, and maintaining the optimism that you have to-- I don't even know where I'm going with this, but does that make sense?
[00:13:09] Erin: Yes, and your composting matters, JH.
[00:13:11] JH: Thank you.
[00:13:12] Kat: Yes, it does. [laughs]
[00:13:14] JH: That's all I was looking for.
[00:13:18] Kat: Yes, it does matter, and I think if you see one person doing it, your neighbor's going to want to do it, and then maybe the whole apartment building is going to start doing it. That's how change happens. This idea of a spectrum of ideal versus pragmatic is something I want to push back on a little bit, because I think that dichotomy can obscure the fact that some idealistic practices are just human rights; they're just morally right.
Designing addictive technology just shouldn't be a thing. On the flip side, more dangerously, this spectrum can obscure the fact that some pragmatic practices in our industry are just unequivocally wrong. We just shouldn't have them. It's like child labor back in the factories. It was wrong.
It's something that comes with time, with speaking up, and with a very multipronged approach, where people in industry, folks like you and me who are designing, work hand in hand with policymakers and folks in academia to formulate the new normal. To formulate the new standards. What is this new culture that we're shaping? It's happening right now, which is really cool.
Last year, I was able to speak at the Federal Trade Commission, and they're starting to crack down on regulating deceptive designs. Same at the European Parliament; there was a hearing this year that I spoke at where they're actually trying to draft law that will start punishing companies that use these deceptive, manipulative designs. That's encouraging, because you're seeing it actually unfold in real time.
Of course, policymaking's not perfect. There's a lot of slow bureaucracy and everything, but the ball's moving now, at least.
[00:15:22] Erin: When you talk about deceptive design, you're talking about the dark patterns? What are we talking about?
[00:15:26] Kat: Yes, it's commonly known as dark patterns. It's a term that we in the industry are trying to phase out a bit, because dark patterns relies on that metaphor of dark being bad and light being good, which is kind of problematic and racist. It's also a very nebulous term. What does dark even mean for design? So we're trying to phase it out.
It's tricky, because deceptive is not necessarily a perfect replacement either; some of those patterns are not deceptive so much as annoying or nagging. I think the industry is still trying to find a good term for it, but deceptive design covers at least a fragment of what we're talking about.
[00:16:12] JH: As an example, to maybe make this tangible for folks: something that maybe doesn't feel deceptive but also doesn't feel like good ethical design is excessive notifications or something like that. It's straightforward about what it is, but it's maybe not to anyone's benefit to over-notify. Would that be it, or are there other examples?
[00:16:28] Kat: That's a great example. Another example is that petty, passive-aggressive copy that you see, where it's like-
[00:16:35] Erin: I hate those so much. I know what you're talking about.
[00:16:37] JH: Oh, like, "No, I won't pay full price," or whatever those-
[00:16:38] Erin: "I don't want to save money today."
[00:16:39] Kat: Yes.
[00:16:40] JH: I fucking hate those.
[00:16:41] Erin: Those are the worst.
[00:16:41] Kat: It's so annoying, right? That's an example of that.
[00:16:45] Erin: I always say I don't want that on the site.
[00:16:48] Kat: Yes.
[00:16:48] Erin: Get away from me. I don't want it.
[00:16:50] Kat: Whose job is it to write that?
[00:16:52] Erin: Obviously it's effective, because 10 years later, it's still hap-
[00:16:56] JH: That's what I was going to say, it probably works, right? Yes.
[00:16:59] Erin: I'm sure it does.
[00:17:00] Kat: Yes, exactly. That's another great example of them.
[00:17:04] JH: Yes. That is a good one. Maybe just to play this out a little: I'm a designer on an e-commerce site, and somebody's running an A/B test, and they saw that we get way more conversion lift when we phrase it that way, but you think it's not right, or it's wrong, or just obnoxious. What do you do? Do you work really hard to figure out an ethical version that outperforms it, or do you have to draw a line in the sand, like, "We're not going to profit that way"?
I think, to your point earlier about leaders versus ICs, Erin, it's probably not something an individual designer is going to be empowered to call. Do you just have to wait for the regulation to come and say, "You can't write obnoxious copy"? That feels like a hard one to regulate in some ways.
[00:17:42] Kat: Yes. Actually, you're spot-on. It's very tricky to regulate, because a lot of these patterns shift and change, especially with different kinds of technology. I also don't know that it's necessarily always designers writing these things, either. It could very well be marketers and whatnot.
[00:18:02] Erin: Sounds like a marketer.
[00:18:03] JH: It does sound like a marketer.
[00:18:04] Kat: Right? It really could. Something I talk about a lot is the chain of command, or the ways of working in these tech companies, where a designer or a marketer, whomever, is likely reporting to their boss or their team, and they have certain metrics or OKRs they're trying to fulfill, so they're grasping at anything. "What can I do to make sure the clickthrough rate for this one button goes up, so I can show that I've made an impact and get a promotion?"
That's how these practices are motivated and incentivized. It's not always a manager at a company saying, "You have to lie," or, "Put a deceptive thing here." It's more, "You've got to boost engagement, otherwise your job could be on the line." Then we look around and see what practices we can use, and it just so happens that these things are unfortunately super effective, which is not great. That's why regulation has to happen, and also education.
A lot of the practices we use are taught in boot camps or schools. The whole idea of getting users hooked has been taught in design schooling for a while. How can we turn away from that school of thought and really think about how to be respectful to our users and not be annoying? That's another thing: can we not be annoying to people, and remember that we're designing for people like ourselves and for others?
[00:19:41] JH: On the hooked thing, because I feel like you hear examples, too, that are more positive, and maybe they're a little paternalistic or something, I don't know how best to describe it. A classic example that comes up is toothpaste. People weren't brushing their teeth a lot, and then Colgate or whoever put mint in it, and it had this refreshing feel that made it more of a habit, and a lot of people adopted it. In some ways, that seems like probably a good thing, probably good for tooth health or hygiene. I'm not a dentist, I don't know for sure, but that seems like it comes from a place of better intent than "let's get them coming to our app all the time." Even that feels fuzzy, though. Who gets to determine that this is okay behavior change and that is pushing it in a bad direction?
[00:20:23] Kat: I think consent is a really key thing here. With the toothpaste example, or any of those examples, like the government programs that try to bump up pension rates by nudging people to get their pension started and sorted out, the intended users are not being lied to or anything. They know what they're getting themselves into.
They know that they're buying toothpaste. Whereas with more of the deceptive stuff, you're trying to deceive or nudge people into paying more money, and that's not something we would necessarily consent to if we had full knowledge of what's happening. For some things, it's, "Are we acting in the user's best interest, or are we acting in the company's interest?" You have to figure that out, and oftentimes it can be grey, but sometimes things just feel right versus feel wrong, and nudging people to--
[00:21:22] Erin: Yes. Listen to that feeling.
[00:21:23] Kat: Yes. That gut feeling. There's probably a philosophical term out there that sums up that intuition we have, but some things just don't feel right.
[00:21:35] JH: Yes. I can imagine, too, that within companies this is just going to vary situation by situation. I'm thinking of an experience early in my career. I was working at an e-commerce site, and a junior marketer sent out an email with a very provocative, clickbait-to-the-max subject line, like, "Check out what you did last night," or who knows, something like that.
It got a ton of great click-through and all this stuff, but the CMO got wind of it and came in and was like, "Shut this down. This is not going to happen. This is a bad email. We're going to get marked as spam, et cetera. There are going to be long-term effects of this that are not good for the business, so we're going to say no." At the same time, though, we were also running all these free campaigns where the thing was free, but then there were a bunch of cross-sells and you'd pay a lot for shipping, so the company makes money.
Is that a good thing or not? I don't know. Even within that one role, it was very multifaceted. For certain things, I think the right decision was made; other things, you could argue, were a little more deceptive. What I'm trying to get at is that it's just complicated. I'd imagine that happens in any business. There are so many decisions being made, and so many different factors in how you keep users engaged, that some are going to feel clearly over the line and some are a little more in the middle.
[00:22:38] Kat: It's important for designers and researchers within companies to really examine the way they're working, and also to consult the existing research out there on what's okay and what's not okay. There's been a lot of scholarship around doing ethical research, not just in the design sphere, but also from other fields that have been doing research for a long time, like medicine and anything involving children.
There are practices we can borrow from when we think about what's okay and what's not okay.
[Commercial break starts]
[00:23:21] JH: All right. A quick awkward interruption here. It's fun to talk about user research, but you know what's really fun is doing user research and we want to help you with that.
[00:23:29] Erin: We want to help you so much that we have created a special place. It's called userinterviews.com/awkward for you to get your first three participants free.
[00:23:40] JH: We all know we should be talking to users more, so we've gone ahead and removed as many barriers as possible. It's going to be easy. It's going to be quick. You're going to love it, so get over there and check it out.

[Commercial break ends]
[00:23:49] Erin: Then when you're done with that, go on over to your favorite podcasting app and leave us a review, please. Before the break, we talked a little bit about respecting the audience you're trying to serve. Even in the almost five years I've worked at User Interviews, you see these conversations shift in their maturity: where we are with ethical research, with accessibility and inclusion, with whether a company needs user research, with all sorts of things.
It feels like we're at a stage where people are pretty aware of inclusive representation; they know it's something they should do, an imperative. But you were talking about not doing it in a way that's condescending, that presupposes we know the best way to help a population without actually, objectively researching it. I'd love to dig a little more into inclusive representation and how to do it ethically once we've established we should do it.
How do we do it ethically across internal teams, working with participants, and in research throughout our design process, and so on?
[00:24:57] Kat: Yes. This is a great point. I'm glad you brought this up again, because I think there are two sides to inclusive representation. One side is the one we commonly know about: how are you making sure that you're engaging with communities that span various racial, socioeconomic, and gender differences, thinking about disability and sexual orientation, et cetera? That's a huge part.
Then I think the other side of it is making sure that we're not being inclusive just so we can be inclusive, while we're still designing a product that's very problematic. What I mean by that is, Ruha Benjamin does amazing work talking about, for example, surveillance tools and predictive policing tools. Sure, you can be inclusive and include Black folks in your research, but at the end of the day, the tool you're designing is meant to prey on those communities.
Overall, if you zoom out, you realize, "Oh, wow, sure, I'm being inclusive, but my end goal of putting this product out there has a lot of harm embedded in it."
[00:26:10] Erin: It feels like the first filter there, maybe, for a designer is: choose who you work for wisely. There's only so much you can do if the product itself is-
[00:26:18] Kat: Exactly.
[00:26:19] Erin: If you have an ad-supported social media platform, maybe you're going to have problems with addiction unless how that works changes. But I hear you a little bit, right?
[00:26:31] Kat: Yes. You can work for a diet product, like a tummy tea, one of those things, and be inclusive with how you market it, but at the end of the day, it's still championing a really problematic product. So, reflect on what your product is doing and how it exists in the system in which we live. Who does it help? Who does it hurt? Do that first check, and then move into actually thinking about inclusivity when you're doing your research.
Thinking about how you can be empowering in your research practices as opposed to exploitative and thinking about that difference.
[00:27:11] JH: Does it ever just translate or evolve into-- maybe it started as a performative, check-the-box thing, like, "Let's get a more diverse group of participants for this research," even if there wasn't well-meaning intent behind it. Just by the exposure, does it ever trend in a more positive direction, like, "Well, I heard a little something from that person that actually did stick with me, even if in the short term I still made the decision I would've made anyway"?
Like it opened the door a crack. Or does it need to come from a more heartfelt place from the start, with better intentions? Does that make any sense?
[00:27:42] Kat: If I'm understanding it correctly, is it that people can almost accidentally stumble upon-
[00:27:49] JH: Yes. Does it help change your worldview a little, even if you didn't go into it with that intent? I'm imagining I'm going to do some on-location research because we're going to open a store in a neighborhood I'm not usually in.
You're going there as maybe a performative thing, to check the box or whatever, but over time, by spending time in that neighborhood, you're probably going to pick up and internalize some things that hopefully nudge you in a better, more well-meaning direction. Does that not happen? I guess I actually don't know.
[00:28:19] Kat: I think it could happen. Of course, depending on who it is, there's also a chance they could come out of it without realizing anything, but I would hope it would happen. I would hope that as a researcher or designer, you have the openness and self-awareness to observe, understand, and be open to different ways of being, of existing. That's a skill we hone over the course of our lifetime; it's not something we're necessarily born with.
[00:28:51] Erin: How long have you been thinking about ethical design and research?
[00:28:56] Kat: For me, it's been the bulk of my career. When I came into the tech industry, I came in with the rose-tinted glasses, thinking that we were going to change the world for the better. There's that episode of Silicon Valley where they're all at this tech conference, the TechCrunch thing or whatever, and everyone's like, "We're going to revolutionize the world with this yada, yada, yada app." I think I came in with that-
[00:29:26] Erin: Music compression?
[00:29:27] Kat: Right, yes. Then it was just headline after headline of how certain companies that perhaps we'd looked up to in the past were doing horrible things. I think that was an eye-opening moment for me, even though historically you've seen that a lot with other industries as well, not just tech. It was really disappointing, and I think it was also very annoying, or frustrating, because we would hear countless executives saying, "I'm sorry, we need ethics." What does that even mean? What does it mean to have ethics? Of course, they were talking the talk, but how could we as designers actively intervene in our ways of working, or at least change the ways we work, to better reflect on this stuff? It's really easy when you're in a product team to get super excited about what you're designing. Everyone's really positive and enthusiastic, and then it's really easy to forget to consider what could go wrong.
I've noticed in my workshops, though, because I've already primed people to think about this stuff, that when they actually dive into the activities, they typically think about the what-could-go-wrong aspects way more than they think about what benefits the product brings. When you're in a team, having fun with people, designing fun things, it's hard to flip that switch in your head, because no one's priming you there. You have to prime yourself.
There are a lot of different tools out there, a lot of scholarship out there, to get you to think about that, which goes back to the reading part: getting aware of what's going on. It's good to keep yourself informed in that way.
[00:31:22] JH: I feel like what's really funny about all those "we're changing the world" tropes is that a lot of times, the rest is just left unsaid. They don't even say "for the better." It's just, "We're changing the world." It's like, "In what direction?"
[00:31:30] Kat: Yes, we're just changing the world?
[laughter]
[00:31:33] Erin: Can't argue with that.
[00:31:35] JH: True, but it would be nice to know.
[00:31:37] Kat: Yes. [laughs]
[00:31:38] Erin: You said you started with rose-tinted glasses. What tint are they now? What are you feeling optimistic about? Pessimistic about? What does the future look like?
[00:31:51] Kat: In some ways, there's a lot to be pessimistic about, but ultimately I do have hope. Otherwise, there's no way you can stay in this industry, or any industry; you have to hope that there can be some change. But I do think it needs to come from regulation. Companies today, in this system, operate based on incentives, and they're always doing these cost-benefit analyses of what they can get away with, what fines they can pay out and still be okay.
At the end of the day, we do need some more stringent rules around how companies are able to behave and how design practitioners, technologists, et cetera, can act. It's not going to be easy, but there are lots of different perspectives we can bring into the room when it comes to crafting that legislation. It's starting to happen, and I think it's going to keep happening more and more.
[00:32:59] Erin: I'm with you. Regulation, because corporate greed means people will not do the right thing just because it's the right thing, and corporations are people, according to the Supreme Court. A question, though: what about the user experience of regulation? These laws come down, they are long, they are hard to understand, they're hard to execute on, and it's very easy for large corporations to hire a bunch of lawyers to read them and follow the letter of the law.
Maybe that's not a bad outcome; if the regulations are good and effective, the companies causing the most harm hopefully are no longer able to cause it. But it can be very hard for smaller companies, who maybe are trying to do the right thing, to spend all their time dealing with regulations. What do you think about that trade-off?
[00:33:55] Kat: It's such a good point that you brought that up. It's incredibly frustrating when you realize that the mechanisms in the system can be very slow-moving and inefficient. I was on a panel with a federal trade commissioner who was talking about what we could do from a regulatory standpoint to crack down on deceptive designs. They were talking about loopholes: how can we find a loophole to ensure that this gets passed?
It's wild when you think about how we can't just make a law that says this; we have to go through all these twists and turns. I think that goes to the conversation about reform versus revolution. Revolution can definitely be a very provocative word in some ways, but sometimes we do need to think about these things, just like with the climate crisis. We can't go on the way we're going; something has to give and something has to change.
It's either going to be the environment or us. With tech, it's either going to be the tech companies or our mental health, our safety, our privacy, et cetera. That's why regulation matters when paired with the community organizing that's been happening in tech. The number of unions that have been popping up within our industry in the last decade, or even five years, I dare say, is incredible.
I think that's going to play a huge role. The momentum behind university students boycotting Palantir, for example, is really important. So is having that discourse around, "Where do we draw the line as designers or engineers? What is our stance, and how can we refuse?" I think that when you bring in that holistic approach, from regulation, from us as individual actors, but also community movements, that can be very powerful.
That's why I'm optimistic. It's hard to be optimistic, but I'm slightly optimistic because the stuff is happening.
[00:36:10] Erin: Yes, the arc of history is long and all that.
[00:36:13] JH: The regulation thing, I think, is really tough because, to your point, it's hard to craft good regulation that is surgical enough to actually address the issue, let alone get it passed. In a lot of cases, it also has this unintended effect of benefiting the large incumbents, because they have the lawyers and the resources to deal with the regulation, and it can actually stifle some of the newer entrants that are maybe more well-meaning or progressive on some of these things. It's really tough.
The other way you see it solved in some cases-- not solved, but positive movement-- is through public funding of better alternatives. To go back to the climate crisis, a lot of that initial research money into solar and things like that helped kickstart the industry. At some point, fingers crossed, it's just going to be the cheaper power option.
Companies that use a ton of power are not going to choose it because it's the moral thing; they'll just be like, "Well, this is the cheapest way to get electricity, and we use a ton of electricity." You tip it over that way. Are there things in the public sphere that could go that way, whether it's education or things like that, that could maybe also help here?
[00:37:14] Kat: One of the things that comes to mind is the culture of open-source technology. That is something I have a lot of hope in. I've worked in open source in the past, at my last company. There's this idea that we can build a product that's for people, by people. One of my favorite apps that I use for messaging, Signal, is an open-source product. I think there's a lot of potential there. If we can shift things back into the commons, that's very promising.
Having open-source technology, which of course has to be vetted for security and whatnot, power a lot of these functions that are increasingly being outsourced to private tech companies, when they really shouldn't be, could be a potential turning point. Especially when we think about how a lot of healthcare functions are now being shuttled off to private tech companies. That was a huge ordeal during the pandemic, which is still ongoing.
We saw in the UK that there were lots of problems with privacy when they used all those privatized contact-tracing companies. I think going back to open source is something we have to do and have to invest in. It's also cool because it encourages a culture of building together. When I was working on that open-source product, it was honestly amazing to see people from all different time zones, all over the world, willingly contributing their time to it.
Of course, we'd have to figure out ways to compensate people, but building something together for a greater good was really cool.
[00:39:00] Erin: I feel like this is our in to dive into the blockchain. I know it. I can feel it. I'm just saying, we haven't even done an episode on blockchain or anything.
[00:39:10] JH: Maybe a real tangible example, because I feel like you might know more about this than I do: the regulation around the cookie consent stuff, where you have all those popups now. Have those helped anyone, or is that a well-meaning thing that just hasn't done anything? What's your take on that?
[00:39:24] Kat: Exactly. It's interesting because it's been talked about in Europe, where I live right now. Every website you go to has a cookie banner. That has in some ways inadvertently led to other deceptive or manipulative patterns, where companies that have to have this banner make it so that you have to individually uncheck every single box if you want to opt out. Of course, most people don't have the time.
[00:39:56] JH: They just throw other stuff in there?
[00:39:57] Kat: Exactly. They throw long lists in there, and people typically just accept the default, which is to accept the cookies. There have been problems with that. It goes back to the difficulty of crafting legislation that encompasses all those potential loopholes. Well, people are creative. There are a lot of loopholes. It is being addressed right now, though.
There are organizations working on how to redesign the cookie banner and redesign that policy. When we consider regulating other technological phenomena that are annoying, we have to be mindful of this past incident.
[00:40:48] JH: Right. There's a lot of unintended consequences that come out of these things.
[00:40:49] Kat: Yes.
[00:40:51] Erin: I definitely recommend the Chrome extension that blocks all the cookie banners. It's the best extension.
[00:40:58] Kat: What? Okay.
[00:40:59] Erin: Yes. I don't have it for mobile, which I should see if I can find, but it has made my life so much better.
[00:41:05] JH: Yes. I know there's one for Safari too, which you can probably get for mobile. It's called StopTheMadness or something, and it blocks out those things automatically. Not a great solution in other ways, probably.
[00:41:14] Erin: Works for me.
[00:41:14] Kat: [laughs]
[00:41:15] JH: Yes.
[00:41:16] Erin: All right. Well, leave us on a-- well, you don't have to leave us on a good note. You can choose the tone as you like. What's the one thing, or a one thing, you'd like people to take away when they're thinking about ethical design?
[00:41:29] Kat: Going back to what I was saying about the optimistic part: it can be hard to be optimistic, but find your allies, find your accomplices, find people who want to support you and can validate and stand by what you feel. If something's going wrong, they can be there in solidarity. That's incredibly important. We've seen in companies that have been problematic how powerful the organizing that employees do can be. I think that is good.
Also, as a designer or researcher, looking out for your mental health is super important. We're still in a pandemic, as we were saying in the beginning, and this work is hard. It's hard work, so looking out for yourself, taking care of yourself and your soul, and nourishing yourself in that way is really important.
Sometimes, if things are rough and you find yourself in a difficult pickle at a company, knowing that there are other alternatives out there, other options, and finding that community to back you up is good.
[00:42:50] Erin: Awesome. It looks like you have some resources on your website for finding communities that way as well.
[00:42:57] JH: There are so many small steps we can take forward. I think that was something you said earlier that really stuck with me. Progress is good, and doing little things along the way adds up to bigger changes down the road. There are things people can pull in that are worth doing.
[00:43:10] Kat: Exactly. At the end of the day, no one necessarily wants to be the bad person. No one wants to be a villain. Hopefully not. It's just a matter of figuring out how to get the ball rolling when it comes to change. Sometimes that means changing the configuration of incentives currently in place and looking at all the nitty-gritty, detailed stuff, which is a lot, but very worth it.
[00:43:43] Erin: Thank you so much for joining us today. You've been a great guest and given us a lot of great stuff to think about. Thank you so much.
[00:43:50] Kat: Thank you for having me.
[00:43:52] JH: Yes, likewise. Excited to check out all these resources too.
[00:43:54] Kat: Yes. Take care, everyone.
[music]
[00:44:02] Erin: Thanks for listening to Awkward Silences brought to you by User Interviews.
[00:44:07] JH: Theme music by Fragile Gang.

Creators and Guests

Erin May
Host
Senior VP of Marketing & Growth at User Interviews
John-Henry Forster
Host
Former SVP of Product at User Interviews and long-time co-host (now at Skedda)
Kat Zhou
Guest
Kat Zhou is a product developer and designer who focuses on integrating ethics into the design of AI systems. Currently, she is a Senior Product Designer at Epidemic Sound. Before that, Kat was the creator of the Design Ethically project, a member of the Board of Advisors at The YX Foundation, and a Product Designer at Spotify and IBM. Kat is also a strong advocate for more inclusive and privacy-friendly approaches to AI.