Tech's Pre-seatbelt Phase, with Eva PenzeyMoog

Jerome Goodrich

August 05, 2021

When we think of technology breaches, we often imagine a nefarious hacker behind a computer screen far away. But for survivors of domestic violence and abuse, the most common perpetrators of invasive and harmful security breaches are people they already know and interact with. Reducing this potential for harm should be an urgent priority for technologists, but doing so will require new sets of standards, regulations, and a new way of thinking about technology in people's lives.

For the second episode of Collaborative Craft we interviewed Eva PenzeyMoog, who's on a mission to raise awareness about the interpersonal harm that can come from technology products and tell the stories of survivors.

In addition to being a principal designer here at 8th Light, Eva PenzeyMoog is the founder of The Inclusive Safety Project and the author of Design For Safety, which was released earlier this week by A Book Apart.

If you'd like to receive new episodes as they're published, please subscribe to Collaborative Craft in Apple Podcasts, Google Podcasts, Spotify, or wherever you get your podcasts. If you enjoyed this episode, please consider leaving a review in Apple Podcasts. It really helps others find the show.

This podcast was produced by Dante32.

Episode Transcript

Tech's Pre-seatbelt Phase, with Eva PenzeyMoog

Jerome: [00:00:00] Hi everyone. Jerome here. Just a heads up that this episode has content that may trigger some listeners. We share stories of domestic violence and interpersonal abuse. And we understand if you need to sit this episode out. We'll catch you on the next one. Thanks.

Thomas: [00:00:20] Hi everyone. I'm Thomas Countz.

Jerome: [00:00:22] And I'm Jerome Goodrich.

Thomas: [00:00:24] And you're listening to Collaborative Craft. A podcast brought to you by 8th Light. So Jerome, who are we talking with today?

Jerome: [00:00:33] Thomas, I am thrilled that we are talking to Eva PenzeyMoog. She's a user experience and safety designer, and founder of the Inclusive Safety Project. Her book, Design for Safety, is now available from the publisher, A Book Apart. Before joining the tech field, she worked in the nonprofit space and volunteered as a domestic violence educator and rape crisis counselor. Her safety design work brings together her expertise in domestic violence and technology, helping technologists understand how their creations facilitate interpersonal harm and how to prevent it through intentionally prioritizing the most vulnerable users. She works to make designing for safety the norm, and to bring fellow technologists into the effort to transform the tech industry into one that prioritizes safety, justice, and compassion. So basically she's like a superhero.

Thomas: [00:01:40] Yeah, I really want that to be my bio one day. I think I'd like to live in a world where that's a boring bio actually.

Jerome: [00:01:51] Yeah, heard. For sure.

Thomas: [00:01:54] Where that's just like, that's like saying I work in tech and I use a computer.

Jerome: [00:02:02] Or I drive a car and I use a seatbelt.

Thomas: [00:02:07] Exactly. So I'm really excited to talk to Eva today because her work encompasses a bit of the kind of privacy and ethics sphere of tech that I feel like I was never really aware of. I've seen Coded Bias and I've experienced algorithmic bias, and I can see the arguments for privacy and reform for those reasons. But Eva's work is specifically around safety. And I think she calls it interpersonal harm, which is how people you know can use technology against you, versus how a corporation or a government can use technology against you. So it's a whole different perspective on the harm technology can cause, one that I don't think I was ever aware of.

Jerome: [00:03:09] Yeah, exactly. It's startling to see that safety in technology is something I would consider new and haven't really thought about. But I'm also very grateful that someone we work with, and even just someone, is working to try and address these issues.

Thomas: [00:03:34] A hundred percent and I'm hoping that by the end of this conversation, we can all be working towards it.

Jerome: [00:03:42] Yeah, I think that's definitely the hope. Awareness is one thing. I would love to walk out of the conversation with some actionable things that I can do. And I'm sure that those are talked about in the book as well. And I can't wait to read it, but hoping that Eva has some stories for us.

Thomas: [00:03:58] So without further ado, let's head on over and chat with Eva.

Jerome: [00:04:03] Let's do it.

Thanks for joining us, Eva. We're really excited to talk to you today.

Eva: [00:04:12] Thanks so much for having me.

Jerome: [00:04:14] Just to kick it off, what does it mean to design for safety?

Eva: [00:04:17] So, designing for safety, when I use that term, is really all about designing against interpersonal harm. It's about identifying the ways that technology is already being used, or will be used, by bad actors and abusers to enact power and control, or to harass or stalk or commit other types of abuse against the people who are part of their lives. So usually it's intimate partners, but there are also issues with children and the elderly. Really, it's anyone who is sort of in the domestic space or that you have access to in an interpersonal way.

Thomas: [00:04:52] Primarily you work with software products in the tech space, but it sounds like this idea of design encompasses designing anything: experiences or physical products.

Eva: [00:05:04] Yeah, my emphasis is definitely in the tech space, but even just in the work that I've done in tech, there've been ways that it definitely sort of bleeds out into these other areas of consideration. There's definitely a lot about service design, which is, you know, thinking about an entire process. For example, Amazon. It's not just buying the product. It's also getting it delivered and opening the box and opening the product itself and learning to use the product. All these other things that are involved outside of just the actual digital part. And there's definitely a lot of ways that that comes into play with designing for safety. One thing that I harp on a lot about in the book is customer support and needing robust customer support. And how sometimes you just need to be able to talk to a human who can understand the nuances of your situation and give you some type of support. Maybe you just need access to your account. Maybe you need to kick someone else out. Maybe you need to explain what's happening and ask if they know what's going on. There are so many different things that you might need customer support for, but a lot of companies sort of skimp on that and kind of tell themselves stories about not really needing it. So yeah, there's absolutely a non-digital part to this as well.

Jerome: [00:06:25] It almost seems like an extension of security, in a way that we don't normally think about it. Is that accurate? Is that kind of how you describe it, or at least get people on board?

Eva: [00:06:39] Yeah, I think that's definitely accurate. I sort of think of it as this threat model that isn't being prioritized or even thought about. Usually when we think about security, people are thinking about sort of anonymous bad actors, nefarious hackers, what have you. But this threat model is, what about the threat of the person who's living in the house with you, who already has access to your devices and who can use violence or threats of violence to get your passwords? This also comes into play with surveillance of children and the elderly; you know, parents and other people can just demand passwords. And that's a really common thing that we see all the time. So there is this threat model that sort of gets ignored for the most part. And it is really different from the things that security professionals are usually thinking about. And it requires a totally different sort of analysis of what the problem is and what the solutions might be.

Thomas: [00:07:34] Yeah. I don't know what the numbers might be, but it seems like for consumer products, abuse is much more likely to come from someone you know. You think, who's going to hack my smart home? You might not think, oh, Anonymous is going to hack my smart home, but people I know, or people who have access to it, can abuse the privilege of having access to that kind of technology.

Eva: [00:08:04] Yeah, absolutely. I mean, there are examples of creepy anonymous people, you know, hacking into Ring cameras, because Ring didn't require changing the password for a long time. And there are really creepy videos online of people talking to children, like getting into a Ring camera in a little girl's bedroom and saying that he's Santa Claus, really creepy stuff. But then there are, you know, a lot more examples of an ex-partner being able to watch someone. Ring also didn't log everyone out after a password change for a long time, until it was reported on. And there is one example that I talk about in the book: a man who broke up with his boyfriend, but his boyfriend had access to the Ring, so he changed his password. That's what all the guides, everything online, tell you to do to kick people out: change your password. But it never actually logged out everyone else, so his ex was still able to see what he was doing. And then he would also, you know, remotely ring the doorbell in the middle of the night so that the guy would have to get up and go downstairs to the door. And then nobody's there, and then he's losing sleep. So yeah, there are very serious repercussions to this stuff and it's definitely happening all the time.

Jerome: [00:09:15] The thing that's crazy to me is the fact that it is so prevalent, but you're kind of like a trailblazer for this kind of thinking, and this kind of work. Maybe you don't think of yourself that way, but you're writing the book on it, right? So, I'm curious about like how you got to that point and when you recognized that there was this gap?

Eva: [00:09:37] Yeah. So, when I first became interested in this, when I was first a junior designer, recognizing that this was definitely a problem, I looked around for a group. I was really sure that someone was already working on this and I just needed to find this group so I could sign up to volunteer, and then I couldn't find them. And I was like, well, I guess if I want to contribute to tech not being weaponized for domestic violence and, you know, other interpersonal harms, I need to try to do something about it myself. So yeah, I guess I am. Or, well, I don't know if there was a question there about the trailblazer thing.

There are a lot of really, really smart people doing very related work that I've been able to sort of build off of and learn from, which has been really useful. Leonie Tanczer is a hugely important researcher doing research specifically into Internet of Things abuse, which is really important. There's a woman named Molly Dragiewicz in Australia who is researching tech abuse more generally. So there are definitely people in the academic space who are working on this stuff, but I couldn't find anyone who was actually in tech. And I think there's a big problem there: the really important things that are being studied in academia don't necessarily translate into technologists, you know, actually even knowing about this research, much less doing anything with it. So my approach has always been to try to make it very practical and realistic. Like, this is exactly what you can do. You can think about all these things while you're actually at work. I have a whole chapter in the book about how to implement what I call the process for inclusive safety, which is just the process teams can go through to start to identify and prevent these types of harms. Very hands-on, specific guidance, basically.

Thomas: [00:11:29] I want to talk more about the content of your book for sure, and definitely a lot of the practical things that people can do. And, for sure, people should read your book, but I'd also like to dive into some of that. But before we do, on this topic of trailblazing, we talked earlier about the vulnerability behind that, and, you know, your work represents a lot of the stories that you've encountered through your work. And if there aren't a ton of peers out there who are focused on the same kind of practicalities as you are, it's kind of a lot of responsibility. So I'd love, if you're willing, for you to talk about the experience of authoring this book and finding the courage to speak out about design flaws at huge companies. And then also how you find your footing in terms of feeling like this is the right thing, or that you're going down the right path, or that your suggestions are ones that you really believe in.

Eva: [00:12:29] Yeah. So vulnerability is definitely a constant thing for me in this work. It is really vulnerable, and I feel vulnerable pretty much all the time because there's no one that I can go to to get a stamp of approval, to say, "yes, you're doing the right thing, keep going," or "no, this isn't quite the right direction. Why don't you do this instead?" There's no one that I can go to for that type of validation, which is really, really tough. I do have a mentor who has kind of taken on big issues in tech, who is great. The thing that I do do that helps a lot is I talk to survivors a lot, as well as advocates who work in the domestic violence space, to get feedback and run ideas past them. Which is really useful and obviously really important for anyone who's going to center a group in their work. You need to actually center that group in your work and make sure that you're not going to do something that's inadvertently going to cause harm. And that's something that I think about a lot too: how stuff like this can really backfire. It's not quite the same, but something that the design team at 8th Light and I were talking about recently was some of the issues with the desire to bring non-designers into the design practice, you know, bringing stakeholders in for a workshop as you're going through the design process, and these different things that people do, and how that has had some really bad unintended consequences of possibly devaluing designers and the work we do, and making people feel like it's not real work or anyone can kind of do it, which totally isn't true. But it's kind of shaken out that way in some ways.

So yeah, I am thinking about like, could this possibly backfire and like end up doing more harm? So that's obviously scary and would be kind of my nightmare scenario, but I think overall I've gotten enough positive feedback from survivors and advocates in the space that I feel confident that it's going to be a net positive and help people understand these issues and that they're happening in the first place. And then that there is something that they can do about it to prevent it. Or if, you know, sometimes prevention is impossible, but there are often ways that you can give power and control back to the survivor in that situation. So helping people understand those things I think is going to be a net positive, but it is something I'm sort of constantly thinking about, and honestly feeling a lot of imposter syndrome about. Sometimes it's just like, who am I to be doing this? But then I'm like, well, someone has to, and I'm kind of equipped.

Thomas: [00:15:08] Yeah. I just- it resonated so deeply with me, you know, as a gay man in tech, as a black person in tech, I feel like I do kind of advocacy work where I can. As much as I can. And I feel imposter syndrome about that too, even though I'm like, I'm part of the group that I'm trying to represent, but can I be a voice for the group that I represent? And should I be? And what is my responsibility to make sure that I'm being as inclusive as possible? And yeah, I can only imagine that this is a very sensitive and a highly impactful and important group of people to be representing and advocating for. And there's a lot of sensitivity that has to go in with that.

Eva: [00:15:59] Yeah. I mean, I do want to say, I did have a lot of experience with this type of content before coming into tech. Like, I'm not a total newcomer to it. There was a guy recently who wanted to do my talk at a conference. I do have volunteers who do that, because me just doing conference speaking isn't a good way to scale this work, so to speak. So I have volunteers in different spots around the world who are taking the talk, making it appropriate for their own culture and their country, and doing it in their specific contexts and locations. But this man didn't have any experience with domestic violence. He wasn't in any way connected to any type of knowledge base with it. And, you know, I was like, "no, actually I would not be comfortable with this, because I'm not sure that you would be able to deliver this content in a way that is appropriate and sensitive." So, you know, I want to say, I do have that background. I guess I did think a lot about, am I the right person to be doing this? And the answer I came to was yes, or I wouldn't be doing this. And then it was kind of like, well, if I can do this and I should do this, then how do you not do it when it's something really important?

And maybe you feel similarly, Thomas. Like, it is scary, but if you can and you should, then you kind of have to. That's at least how I think about it.

Thomas: [00:17:22] Yeah, that's such an admirable perspective. And I don't think I would have put it into those words before, but yeah, that really resonates with me.

Jerome: [00:17:32] Yeah. Thank you both for that moment of vulnerability and sharing that. I really appreciate it.

Thomas: [00:17:40] Since we talked about your experience, Eva, I'd like to talk about your experience with the work that you're doing. I know we haven't gotten into the framework that you've included in the book, the process for inclusive safety, but I'd like to talk about your experience with this type of design, or bringing this type of content to design in the work that you do, and how it's been received or how you thread it in. You mentioned earlier that it's an often ignored threat model. So how have you found a way to, I don't want to say incentivize, because that might sound like there's bad intentions, but how do you bring the content to people who generally don't see it?

Eva: [00:18:21] Yeah. So I think, first of all, people just don't know that this is a thing that's happening. That's always been the case. When I was doing my conference talk, Designing Against Domestic Violence, which is sort of what led to the book, lots of people would say, like, oh, I had no idea that this was happening. I just never knew. And then usually something along the lines of, now I'm going to be thinking about it all the time. So I do think that it is an issue of raising awareness and giving real-life examples. Something that I really prioritized in the talk and the book is telling stories. The nonprofit where I started my career had the saying that if you want to communicate powerfully, tell a story. And I still think about that all the time, because that's just how people remember; that's how we've passed down information for a long time.

So I have just tons of stories, basically, in the book and in the talk about real-life people who have gone through different experiences with technology-facilitated violence and abuse. And telling those stories, I think, is a really powerful way to get people to understand what's happening, as opposed to just, did you know that people will change a Nest thermostat from far away and then, you know, gaslight their partner and tell them that it wasn't them, that they're just not understanding how to use the device, and they're stupid and they can't trust their own reality. Instead of saying something like that, tell the story of what the actual impact of that is. Like someone who went out for coffee, and then her abusive husband, who was away on a business trip, turned the heat up super high while her dog was home. And then she comes home and her dog is panting and has drunk all the water. And she's like, "Oh my god, did I do this?" There are these very real consequences and impacts of these things. So I think actually telling those stories is really, really important.

Jerome: [00:20:16] That just makes a ton of sense to me, too. You start with the stories to generate awareness. And as more and more people become aware, now you have other people thinking about the problem, and you have other ways to, kind of, potentially combat it. You start thinking more on a systemic level as well, right? And maybe that's the segue into talking a little bit about the book, Design for Safety, and the process for inclusive safety. I think Thomas and I are both just really curious about what that looks like in the wild. Is there a real-world example of implementing this process that you could potentially walk us through?

Eva: [00:20:54] Yeah. Okay. So for an actual example, I just did this with the design team at 8th Light, because one of my teammates is working on a side project with his learning time. It's going to be an internal site for employees to collaborate with each other and to get to know each other better, especially for existing employees to get to know new employees, and for new employees to get to know people. It's going to be really cool. So it was a total of two or three hours. First, we got a demo of the site: he showed us the features he had already implemented and the roadmap of what was going to come. And then I led the team in a brainstorm about it. We discussed all the ways that a bad actor or an abuser could use this product for harm. We talked about how someone could see the different groups that an employee is affiliated with, and their interests and their past experiences, and use that knowledge to sort of slide into the DMs, as it were.

We talked about someone making a new group that was specifically harmful or combative. We talked about using, you know, open text fields to enter content that was abusive or even just inappropriate. We also identified a lot of issues that didn't necessarily have to do with someone wanting to weaponize this product for harm, but that would still be sort of passively enabled through the actual design of the product itself. So, things like: what if there's a trans person who wants to change their name, but they're not really able to, or they're not even able to figure out who to talk to about it? And what about the information that is still there after someone has left the company, and do they have control over that?

So these are issues that aren't necessarily about a bad actor trying to do harm, but would still be a harm nonetheless, just through the design. The next step was to come up with solutions to these problems. We talked about giving employees control over the content that they share, ensuring that nothing is actually required for people who don't want to share anything personal. That was a case of: we can't necessarily stop someone from using the information in a nefarious way, but we can give people control over the information. If you're concerned about this happening, or you've had this happen in the past, you can just opt out. And then, this gets back to the sort of service design thing we were talking about earlier: if we have open text fields, who at the company would be managing those responses, and how would we support them in understanding what was expected of them in terms of, you know, their judgment on what's okay and what's not okay? And then a solution to this was maybe we just don't have open text fields at all; maybe we just get rid of those entirely. We talked about giving people control over the information that remains when they leave the company, and the process in place if someone gets fired for being harmful: maybe their stuff just gets completely taken out. And then we talked about things like having text that says, if you want to change your name, contact this person, or just giving them the ability to change their name in the first place.

This also came up when talking about pronouns and how it would be great to have pronouns. And then we kind of talked through how maybe they shouldn't be required, though, because maybe not all trans people are ready for all of their coworkers to know their pronouns on day one of their new job, or how maybe it's just a little more complicated than putting a pronoun into a field. So making that not required. And this is just a couple of things; there were way more than just the things that I'm mentioning here, which was really interesting, because it's such a small project with such a limited scope and a very closed group of people. You know, we have all these institutional guardrails in place that kind of discourage most of the abuse, like people want to keep their jobs, people want to get along with their coworkers. Even in this scenario, there were still so many realistic possible harms that came out of it that you can imagine what it's like for a bigger product with a really big, broad, totally open user base. There are so many more issues to work through. It really is something where you might think, oh, my product probably doesn't need this, but it probably does.

Thomas: [00:24:58] I love that you mentioned that, because you've mentioned some products that exist out in the world. You've mentioned Nest from Google and Ring from Amazon. I'd like, if you are willing, for you to give us a little reaction to AirTags from Apple, which came out recently. Now, I know that there have been some updates since, but I also know that when they first came out, there was quite a lot of feedback around these kinds of concerns. And if you're willing to share any of your reactions to-

Eva: [00:25:31] Oh, I'm willing.

Thomas: [00:25:31] -how that, how that design seemed to go.

Eva: [00:25:35] Yeah. Oh, I'm so glad you asked about this, because I actually tried to shoehorn a brand new section about AirTags into my book after, you know, it was pretty much done. And my editor was like, okay, it's too late, you can't keep adding things, and had to kind of pry the draft out of my hands. So I'm glad I have the opportunity to talk about this, because yeah, it was a big deal and it was really disappointing, because Apple has in the past done a good job of including domestic violence advocates and experts in their design. So it was sad and just surprising that this happened. But yeah, so AirTags, for anyone who doesn't know, it's essentially a Tile competitor that you can affix to, like, a keychain. Your umbrella is an example that they give, or you put it in your backpack, and then it uses the Bluetooth of, I believe, all iPhones and connected Apple devices that have Bluetooth to sort of create a network so that you can find it if you lose it. And they had considered stalking potential and kind of had a solution, which was: if you have an iPhone and it recognizes, via Bluetooth, that there's an unknown AirTag moving with it for three days, then it would sound a sort of chirp alarm.

And I think people immediately were like, that is a long time. A lot of damage can be done in three days. And also, you know, once again, it really ignores the threat model of domestic violence. I think it was considering an anonymous stalker, like someone drops an AirTag into someone's purse at a coffee shop; he doesn't live with that person. Which, you know, that stuff absolutely does happen, but it's kind of like the rapist-in-the-alley trope. That's just not what it usually is. Most commonly it's someone you know. And it's the same with stalking in these types of scenarios: it's, you know, husbands stalking wives, wives stalking husbands, people stalking each other. And so another concern with AirTags was, what about if I put one in my husband's backpack and watch where he's going throughout the day, so that I can make sure he's not, you know, going to his friend's house, the friend who is onto me and my abuse and is always saying that he should leave, and I've told him he can't see that friend. So I'm going to monitor him and make sure he doesn't go to that neighborhood or whatever. But then he comes back home at night. Now the three-day window has reset and the alarm's not going to go off. So, what about that scenario?

What about when it's in your car and you move away from it and then you come back? It's just going to track the location of the car. There are so many different things that they didn't consider. And the good news is that they kind of designed it so that they can update it later. They tied it to an API and not the actual hardware, so they were like, we can update this later if we get it wrong. And everyone was like, okay, well, you got it wrong. You need to update it. So they did. I believe now it's after 24 hours that it'll chirp and give the alarm, and they're going to create an Android app that Android users can download to see if an unknown AirTag is following them, which assumes so much tech literacy. You know, my questions are: how are you going to promote that? How are you going to make sure people know? How are you going to educate people who don't know what this Apple product is? If you're an Android user, you probably don't care that much about Apple products. You know what I mean? So there are just so many issues still. It's good that they were somewhat reactive to all the criticism, but there's still a lot, a lot of work that's needed.

Jerome: [00:29:03] Yeah, not to mention you might be introducing kind of new vectors for threats to reach somebody by creating another app or something like that.

Eva: [00:29:12] Right.

Jerome: [00:29:12] It's such a thorny problem, and I can't help but get a little discouraged when we talk about this stuff, just because, I mean, you were talking about, you know, the small internal side project and just how many potential issues there could be with it. And for me, it's a question of, well, if on these really small side projects there are so many issues, what can one team really hope to do? It seems like these are systemic issues. And I guess I'm curious whether there are ways that you see policy, or other ways, to tackle some of these entrenched systemic ways of thinking that are negatively affecting people's safety when using technology.

Eva: [00:29:57] Yeah. I'm so glad you asked that, Jerome, first of all. Yes, it is overwhelming, but it doesn't have to be. That was another goal with the process: breaking this down into manageable chunks. What I did with the 8th Light team was a few hours. It doesn't have to be a ton of work. And I hope that the book really makes it clear and somewhat simple and just takes a lot of the guesswork out of it. For the full process, I have time chunks, like estimates, so that you can give that to a stakeholder if you need approval, and it's like three to four days to go through the whole process. So it doesn't have to be that much. Obviously, if you have an enormous product, it's going to take a while to sort of retrofit for safety, but it's definitely possible. And it doesn't have to be this overwhelming thing. So I just want to put that out there first, before I answer the rest of your question: it's absolutely achievable to design for safety, and it doesn't have to be that hard.

Jerome: [00:30:54] Don't despair.

Eva: [00:30:56] Yeah, don't despair, basically. So with your question, though, the thing that I think about a lot is that what I'm advocating for with all of my work, essentially, is that individuals and teams opt into doing this and that they bring their stakeholders along. They educate, they help people understand why they need to do this. And then they do this work. Obviously, if I thought that was pointless, I wouldn't be doing it. I think that, you know, individuals and teams do have a great deal of power to make this stuff happen, but there are some obvious limitations to that. You know, Facebook has kind of shown the world that you can do a lot of bad things and not really have any consequences; there's no accountability. Sometimes they'll get sued and make a change, but no one is actually being held personally accountable, and there's no incentive to prevent harm in the first place. So we're kind of in this phase where there are no government mandates saying, "Hey, you have to think about this stuff."

And I think maybe you both have heard me talk about this before, but I'm kind of obsessed with the history of seatbelts, because it's very instructive. Automobile designers can't design a car without a seatbelt right now, but getting to that point was actually this really long, intense struggle, and one that was still ultimately pretty recent. Car companies didn't require seatbelts; they would sell them for an extra cost, and like 2% of people bought them in the fifties. It took decades of activism for people to really revolt against car companies putting their profits ahead of the safety of their users. And I think in tech, we're still in this sort of pre-seatbelt phase, where people can just do whatever they want. So, to answer your question, basically, ideally this would just be mandated: that, you know, you have to be considering safety and the harms that your product is going to enable, and you have to prove that you took some type of action to identify and then prevent them. And then you have to have a plan in place and be held accountable for quickly responding when something happens that you didn't anticipate.

Because I think that will happen even with teams who really do try. I talk about having something in place so that when users are recognizing a harm, you're able to very quickly and gracefully take that feedback, respond, and change the product to be safe. There is a bill right now, introduced in May, I believe, the Algorithmic Justice and Online Platform Transparency Act, which is all about algorithms and how they can't discriminate on the basis of a protected class: race, gender, age, ability. I find that really exciting, and it's actually a really good bill. It would basically say that you have to consider these things, and that there'd be some accountability if your algorithm is perpetuating racism, which many are right now. So I see this and I am hopeful that there's some type of mandate coming, but I think we definitely can't count on that. In the meantime, it needs to be individuals and teams doing this work, and pushing stakeholders and people who might be reluctant to give more time and space to this to do so.

Thomas: [00:34:04] I love that. You're definitely speaking my language. I don't know what that says about me politically, but I'm all for these regulations in all aspects of what we see in tech right now. I love that analogy, the pre-seatbelt phase. I will steal that, but I'll steal it in a way that I will definitely quote you and reference and say that I got that from you. Cause I think that's such a beautiful analogy. As we are nearing the end of our time, I'd like to just know what you're curious about right now or what you're excited about right now. I mean, your book is coming out soon. That's obviously really exciting. So in this space, like what are you really curious to tackle next or investigate next? And what's really got you kind of motivated right now?

Eva: [00:34:56] Yeah, I actually am thinking a lot about this stuff we were just talking about: what the bigger system-level changes are that need to happen. I've been thinking a lot about the history of the seatbelt, which I have been for months, and thinking, how do we recreate that in tech for safety, inclusion, and compassion? Those are the three things that I think we're talking about when we say ethics. So I'm trying to be more specific and not just say, like, ethical tech. I think that's a little vague, or it has maybe lost its meaning in some ways.

So I talk about safety, justice, and compassion (sorry, inclusion is part of justice, I think, in this; this is all still very much rumbling around in my head). But yeah, I'm thinking about that a lot, and I'm working on a new talk about how we can create a similar paradigm shift to what happened with seatbelts, where people went from not buying them and not really caring, with tons of people still dying in traffic fatalities, to over 90% of people now voluntarily buckling up every time they drive. Seatbelts are just standard. We have a whole huge government agency that sets laws and standards around highways and automobiles and the infrastructure around them. None of that was just given to us; activists and everyday people fought for all those things. So I'm really curious about how to get there with tech, and what the next 20 or 30 years look like getting to this place where tech has gone from the pre-seatbelt phase to the post-seatbelt phase, so to speak.

Jerome: [00:36:33] And just as a final thing, how can people contribute if they want to get involved?

Eva: [00:36:38] Well, you know, buy the book. It's available for purchase on August 3rd from A Book Apart, so buying it from abookapart.com is definitely the way to go. And that's gonna be, you know, best for everyone, as opposed to saving a few dollars to get it from Amazon or whatever. By the way, I talk a lot about Amazon in my book. It's just an evil company. So avoid buying my book there if you're able to; I recognize that not everyone has the financial privilege, but if you're able, get it from abookapart.com. Also, I do have volunteers, like I mentioned. So if you're interested in bringing this work to your own country, or even just a different part of the U.S.: I'm not trying to keep all this to myself and just do conference speaking, and that's an impossible task anyway. Especially as a white woman, I don't think it's super appropriate for me to go to Brazil or China or wherever and say, here's how you should think about domestic violence. It's a very sensitive, nuanced cultural thing. So you can contact me if you're interested in becoming a speaker and just having a talk handed to you. There's a contact page on inclusivesafety.com. And then the final thing that I have to do a shameless plug for is that you can hire me to help you do this work: to do a safety audit, like I talked about, which is just a few hours, or to help you implement the process at your company. So yeah, you can go to inclusivesafety.com to learn more about that.

Jerome: [00:38:02] Awesome.

Thomas: [00:38:03] Amazing. And thank you so much. Like you mentioned at the end, you're not trying to kind of keep this all for yourself and that's so obvious in the way that you're so willing to share and to give and to listen. And I appreciate this time with you and having this conversation. So for my part, thank you so much for joining us for this conversation.

Jerome: [00:38:25] Yeah. I second everything that Thomas said: thank you so much. The work that you're doing is just so important, and it seems so obvious and yet, you know, just like seatbelts, right? I'm really hopeful and optimistic about the future. The approach that you're taking is really thoughtful, and yeah, just thank you. Hopefully we'll talk again once your new talk comes out. I think this is something that we want to keep at the forefront of everybody's minds.

Eva: [00:38:57] Yeah. Thank you so much for having me. This was, this was a really fun conversation.

Thomas: [00:39:02] I feel empowered. I can't wait to read your book and learn more.

Jerome: [00:39:05] Same, oh my god, yes.

Thomas: [00:39:15] Normally Jerome and I spend the last few minutes of each episode threading what we learned from our guest into our own experience and inviting you, our listeners, to do the same. This time, we'd like to leave you with the opportunity to reflect and to investigate the resources that Eva shared with us in the show notes. For our part, we'll continue to stay curious and begin incorporating design for safety into our daily work. Thanks again to Eva for joining us on the podcast and for all the work that she's doing to make our technology safer. Here she is again with some final thoughts.

Eva: [00:39:49] You know, there's so much content out there, and I've seen so many talks like this, that educate you on the problem. And then you're fired up and you're like, yeah, I want to fix this, or I don't want to do this accidentally in my work. But then there's not usually a lot of specific follow-through. Or it's sort of vague, like, consider racism, consider gaslighting, and I'm always sort of like, that might work for some people. For me, it just doesn't work. I want to know, when do I do that? You can't think about gaslighting eight hours a day while you're at work. When am I going to spend the time thinking about it? What does that look like? What's an activity I can do? So anyway, that's how I started making this process: I started documenting, how can I actually do this stuff? And then it has sort of evolved over the last three years or so that I've been doing this. So my point in saying all this is that I really urge people to try to think through, what are the actual solutions? What can people actually do that's not just considering something or knowing about it? That is the first step, and it is really important to know about the issue. But then after that, you know, if you want people to really change it, try to give them something specific. So that's my goal anyway.

Jerome: [00:41:07] Thank you so much for listening to this episode of Collaborative Craft.

Thomas: [00:41:11] Check out the show notes for our link to this episode's transcript, and to learn more about our guest.

Jerome: [00:41:16] Collaborative Craft is brought to you by 8th Light and produced by our friends at Dante32.

Thomas: [00:41:22] 8th Light is a software consultancy dedicated to increasing the quality of software in the world by partnering with our clients to deliver custom solutions to ambitious projects.

Jerome: [00:41:32] To learn more about 8th Light and how we can help grow your software development and design capabilities, visit our website at 8thlight.com.

Thomas: [00:41:39] Thanks again for listening. And don't forget to subscribe to Collaborative Craft wherever you get your podcasts.

Jerome: [00:41:45] You can also follow us on Twitter at @CollabCraftPod to join in on the conversations and let us know who you'd love to hear from next.

Thomas: [00:41:52] We'd love to hear from you!

Jerome: [00:41:54] Bye!