Conversations on Applied AI

Eri O'Diah - The Role of Human Bias in Artificial Intelligence

February 01, 2022 Justin Grammens Season 2 Episode 1

The conversation this week is with Eri O'Diah. Eri is a seasoned strategist who has been applying her experience as a creative at the intersection of Black, immigrant, millennial women to deconstruct biased systems and advocate for equitable social outcomes. A leading priority of her work currently is the study of unconscious bias and how deep learning and emerging technologies can be used to better understand and accelerate the deconstruction of cognitive bias and reduce its impact on marginalized groups.

Her career highlights include more than 10 years in marketing and entrepreneurial experience developing and managing complex projects, being a 2019 LLS Woman of the Year nominee, and being named the 2020 Emerging Business Owner by the National Association of Women Business Owners - Minnesota Chapter.

She is a graduate of California State University Northridge with an extensive background in digital marketing strategy, social media, and web development. In her free time, she can be found in a yoga class or volunteering as the regional director of BDPA-Midwest. 

Most recently, Eri has founded SIID. By harnessing the power of artificial intelligence, SIID™ Technologies applies big data and emerging technologies to evaluate, uncover, and correct the influence of human biases on decision-making and communication. You can learn more at SIID.AI.

If you are interested in learning about how AI is being applied across multiple industries, be sure to join us at a future AppliedAI Monthly meetup and help support us so we can put on future Emerging Technologies North non-profit events!

Resources and Topics Mentioned in this Episode

Your host,
Justin Grammens

Eri O'Diah  0:00  
The idea of SIID just started to formulate back in 2019. And in 2020, the incidents that took place here, the murder of George Floyd, the pandemic, and everything that followed, really solidified the path that I needed to take, what needed to happen, and how I could possibly make a greater impact here in Minnesota and elsewhere.

AI Announcer  0:32  
Welcome to the Conversations on Applied AI podcast, where Justin Grammens and the team at Emerging Technologies North talk with experts in the fields of artificial intelligence and deep learning. In each episode, we cut through the hype and dive into how these technologies are being applied to real-world problems today. We hope that you find this episode educational and applicable to your industry, and connect with us to learn more about our organization at AppliedAI. Enjoy.

Justin Grammens  1:03  
Welcome to the Conversations on Applied AI podcast. Today we have Eri O'Diah. Eri is a seasoned strategist who has been applying her experience as a creative at the intersection of Black, immigrant, millennial women to deconstruct biased systems and advocate for equitable social outcomes. A leading priority of her work currently is the study of unconscious bias and how deep learning and emerging technologies can be used to better understand and accelerate the deconstruction of cognitive bias and reduce the impact on marginalized groups. Her career highlights include more than 10 years in marketing and entrepreneurial experience developing and managing complex projects. She's a 2019 LLS Woman of the Year nominee, and was named the 2020 Emerging Business Owner by the National Association of Women Business Owners, Minnesota Chapter. She's a graduate of California State University Northridge with an extensive background in digital marketing strategy, social media, and web development. In her free time, she can be found in a yoga class or volunteering as a regional director of BDPA-Midwest. Most recently, Eri has founded SIID. By harnessing the power of artificial intelligence, SIID Technologies applies big data and emerging technologies to evaluate, uncover, and correct the influence of human biases on decision-making and communication. You can learn more at SIID.AI. Welcome, Eri, thank you so much for being on the program today. Thank you for having me. Awesome, awesome. Well, I gave a little bit of a background here about yourself, where you came from, what you're currently doing. Did you want to fill in the blanks at all with some of the things that you're doing currently today?

Eri O'Diah  2:34  
Sure. Through SIID Technologies, we're currently engaged in the MIT Solve program that's focused on unbundling and reimagining public safety. My background really originated in the entertainment industry, ironically, in broadcast entertainment and television. I spent a little over a decade in LA working at the studio level, from production to alternative programming, which is really reality TV, and then finally in feature films, where I was able to gain some experience in script reading. But my sweet spot was really in the social media realm, developing campaigns and moderation. I love getting into the weeds in the comments section.

Justin Grammens  3:25  
Okay, so you're one of those types of people, hmm?

Eri O'Diah  3:29  
I dive in real deep, and I lean in real hard. That's where I gain much of my insights on, you know, public sentiment, social sentiment, and all of that. So that is my background. How I actually ventured into the AI space was through my work at Collectively Digital, which is a digital marketing agency that I lead as well. Back in 2018, we had the opportunity to provide services to the NFL and their affiliates while they were in town for the Super Bowl, and I was able to attend a conference called SportsCon, where the discussions were all about stadium technology and, you know, the opportunities to leverage stadium technology to drive POS system engagement, or sales, and all of that. And that's really where I began to learn about the real-life capabilities of AI. Prior to that, you know, marketing automation, scheduling, CRM, things like that, I was familiar with. But really, the capabilities from a stadium tech perspective were foreign to me. That time in 2018 was also the height of the Black Lives Matter movement, Kaepernick taking a knee, as well as Trumpism on the rise, and I was privy to quite a bit of interesting conversations, or lack of conversation. I really wanted to make an impact, and I didn't think that another martech solution to help businesses better sell and make money was the right way to go. I really wanted to do something that would change the trajectory of my work and my quality of life, and of those that look like me. So the idea of SIID just started to formulate back in 2019, and in 2020, the incidents that took place here, the murder of George Floyd, the pandemic, and everything that followed, really solidified the path that I needed to take, what needed to happen, and how I could possibly make a greater impact here in Minnesota and elsewhere.

Justin Grammens  5:55  
Nice. Well, you seem like somebody who likes words, I guess, right? You mentioned getting into the comment section. So can you talk a little bit more about how SIID applies to everything that you talked about to begin with, and how it's fitting in with regards to communication, I guess?

Eri O'Diah  6:15  
Sure. From my background in the entertainment and marketing space, and my work at Collectively Digital, one of the things that I've noticed over the years was the bias that was perpetuated by the marketing and media industry. You see that in the lack of representation, or even the stereotypical representation, right? The language used can be very offensive and culturally insensitive. You can see a lot of culturally insensitive speech and biased speech within comments; you only need to read broadly through social media comments in an area and you'll see a lot of that from a social perspective. But beyond that, brands are perpetuating quite a bit of biases, right? You had the 2017 Super Bowl commercial by Dove that had a woman of color getting clean and beautiful by taking a skin-tone shirt off and becoming a white woman. You have Gucci, which actually took a blackface turtleneck from production to market and had to rein it back. I mean, I can go on and on. Even last year, after the George Floyd incident, KFC actually pushed out a "turn of Daddy" campaign that had the Black Power icon as the shadow of a fried chicken drumstick; I mean, you might as well have swapped in the watermelon and orange soda on top of that, right? And they call it a marketing campaign. Even most recently, I mean, you only need to look at a lot of marketing and media agencies by just looking at the team. You know, when you have an all-white team creating marketing and branding campaigns for the very diverse consumer market that we have in this country, you start to question. You look at that team, and you're like, clearly this company, this brand, this agency has a problem with inclusivity. The fact that you don't think you have a problem when you look at your team, and your team does not even represent your consumer base, that's a problem. And that's what a lot of these agencies have.
Which leads to much of the marketing being biased, much of the language used. The target audience is skewed to predominantly Caucasian, you know, and you have the othering of every other group, right? And you have teams that are leaning on stereotypes to create campaigns that they believe would resonate with a diverse group of consumers, because they have no idea, they have no perspective. There's no one on the team to get any perspective from. From my experience working in the entertainment industry, I was relegated to being that one diverse person on the team. I was there to confirm stereotypes, like, you know, "how do you say this in urban slang" or whatever, when I don't even talk like that. I don't know, I'm as clueless as you are, right? But I came to a point where, you know, in the hiring process, my being Black was a problem; then I was too Black, or not Black enough, because I couldn't confirm certain stereotypes, and that was a problem. So where do I go? I'm Black, that's a problem. And then I'm not Black enough to someone like that. Where do I fit?

Justin Grammens  9:58  
Do you think with most of these groups it's just ignorance, or do they really just not care? They just don't know?

Eri O'Diah  10:07  
Oh, they know. I think it's part of the challenge in this country, and in many spaces, to have real conversations around biases, right? And to acknowledge that there's bias, acknowledge that we all have them and they influence our decision-making. When you see teams that are predominantly, if not all, white, then you wonder, what is the decision-making of the HR manager, the hiring managers? What are those decision-making processes that prevent me? You can't tell me there isn't a qualified person of color out there that they could hire.

Justin Grammens  10:45  
Absolutely. Right. Yeah. So it's throughout the organization, typically.

Eri O'Diah  10:49  
Right. So it comes down to conscious bias, because we're leaning on unconscious bias because it's safe. What is hard for us to admit as human beings is that we are consciously biased, that we have hate in us, that we make these decisions based on hate and fear. And that is a conversation no one wants to have, because there's this fear, right? Oh, well, I may, you know, be labeled a racist and all those things. And I say, so what? Now you know. Now you know your true racist decisions; now correct yourself. With awareness comes the opportunity for you to adapt and make changes. Without admitting there's a problem and bringing awareness to the problem, we can't make any changes. And that's kind of where I personally see Minnesota, specifically. There's this Minnesota Nice, where we don't want to address the elephant in the room, right? We don't want to come to terms with who we are, and therefore we don't change. Therefore, we have a human being on the streets of Minneapolis being murdered in broad daylight, right? On camera. And that is our fault as Minnesotans, and that is on us, because we allowed this to happen. We allow it because we're Minnesota Nice. We're not going to say anything, we're going to turn our face away, we're going to close our ears and pretend nothing wrong is happening, when literally human beings are being murdered on the streets.

Justin Grammens  12:33  
I agree. I agree with everything you're saying here, and I commend you for trying to bring technology into the conversation to try and help, I guess, have a third party, an unbiased opinion or view, so we can start having these conversations. What are some words that maybe describe you? I mean, I mentioned at the beginning, you know, you seem to be quite into social outcomes and activism. But I don't know, tell us a little bit more about yourself.

Eri O'Diah  13:04  
The words that describe me: passionate, emotional.

Justin Grammens  13:10  
I sense that.

Eri O'Diah  13:12  
I would say a great amount of empathy, in a sense to my own detriment, I think. Audacious. African.

Justin Grammens  13:20  
Yeah, for sure. Those are all very, very good qualities, and I think something that every entrepreneur needs to have as you're setting out to chart the course with your new AI startup. How would you define AI?

Eri O'Diah  13:37  
Everyone will define AI differently. To me, AI means the assistance of machine learning. A lot of people feel that AI or machines can't be intuitive. I beg to differ. You know, I think there will be a time when a machine can be intuitive. I think that we're trying to leverage the assistance of machines to bring back a certain level of humanity in how we interact with each other, which is why I think there will be a time when machines will become intuitive. Whether that's good or bad, I'm not going to debate. But, you know, I think with a certain level of AI ethics involved, we can set parameters around safeguarding the use of such a powerful tool.

Justin Grammens  14:36  
Sure. Now, I guess as you were talking, I was thinking about machine learning, right? Who trains the machines? Humans train the machines. And so, as you were saying at the beginning, we are inherently biased as humans, right? We are fearful of each other. There's a lot of history, I guess, of racism and all that in our society. How can we then train these machines to not be this way? Or do you think there's always going to be a piece of that?

Eri O'Diah  15:05  
I think, to a certain level, there's always going to be a piece of that. I think that as long as the teams involved in building these machines are not diverse, or are specifically predominantly white male, we will continue to have this issue of algorithmic bias. It's a lack of diverse data, but you can't have diverse data when you don't have diversity.

Justin Grammens  15:33  
Right, right. For sure. So it starts with the team.

Eri O'Diah  15:36  
It starts with the team. Because if there's a dataset missing and you have a diverse team, someone on your team might recognize it, something that you, as a white male, likely may not, or that I, as a Black woman, might not, right? There's a difference in experience from a gender perspective, from an ethnic background perspective, right? And even from a religious perspective, from a faith-based perspective. These datasets are often missing and are not representative of society; for the most part, they predominantly represent only Anglo-European society.

Justin Grammens  16:18  
Yes, yes, for sure. I mean, I had sent you an article; maybe you'd seen it in Wired before.

Eri O'Diah  16:25  
Yeah, you sent it to me, I actually shared it on my LinkedIn. Good. Yeah.

Justin Grammens  16:29  
I mean, you know, it talks about a researcher at Google who got fired. And this has come up a lot recently; she was the first, I think, of a couple that have been. And she was kind of uncovering the biases in the data, the biases, I guess, in the data that Google has access to, and just trying to shine a light on it. And, you know, I don't know which side of the debate you're on, per se. Were they right in firing her? She claims that, you know, they said she was going to leave, that she was already on her way out. But, you know, she was very clearly showing some gaps in what they're trying to do at Google. Large corporations still continue to have a say in this game. Would you agree?

Eri O'Diah  17:13  
Yeah, absolutely, I agree. I'm definitely on Timnit's side. I think that what happened to her is indicative of what happens to most people of color when they ring the alarm in situations like this: they are usually labeled as difficult, not a team player; they're pushed out of companies and/or fired. And there's some kind of justification around pushing them out, and not enough focus on the awareness that they're trying to bring to leadership. I think in Timnit's situation with Google, her research would have impacted their bottom line, and that was a problem. Until corporations put as much or more focus on social impact as they do on their bottom line, the dollar, and their revenue, I don't think anything is going to change; this kind of behavior by corporations will continue.

Justin Grammens  18:16  
I don't know if there needs to be, I hate to say, yet another committee, per se. But I mean, how do you get people to not follow the almighty dollar? You know, because there are companies that do good because it's good.

Eri O'Diah  18:28  
When they're able to do that, there's been a battle to get to that, right? It's been a struggle to get to that place where the focus is to do good. I mean, Basecamp went through a similar issue, where they disbanded their DEI committee because of an issue that was brought up, and then there was a falling out; a number of their leadership actually resigned in protest of that, which I commend them for. Because, I mean, not a lot of people at corporations at that level would even resign in protest. It takes us as individuals to stand up; there's strength in numbers. If we continue to meekly follow along as injustices happen, how are we going to change anything? How are things going to change for people in the future, for our children? This is part of the problem. This is why, as archaic as the idea of hating someone or disliking someone for the sheer pigment of their skin is, we've sat on this issue for over 400-plus years. I mean, it's barbaric, it's archaic. Seriously, we're going on 2022 and we're still talking about race relations, for God's sakes.

Justin Grammens  19:46  
We've got to move the ball forward. And yeah, the fact that your company is actually involved with MIT, you said. I mean, does it start in this education area? Because it sounds like corporations aren't willing to do it. So is it through a nonprofit, through MIT, through Stanford, through all these other educational institutions, you think?

Eri O'Diah  20:08  
We're definitely leaning on a collaboration between academia and industry. We're not going to be able to build a solution like SIID on our own. We need to collaborate, and we're going to need the support of academia. We're going to need the support of corporations who deeply want to change their culture and improve the impact that their brand has on their consumers, truly change that and make a positive impact. Those companies that are committed to that, we want to hear from them, we want to work with them.

Justin Grammens  20:49  
For sure. Well, what's a day in the life of a person in your role?

Eri O'Diah  20:54  
Oh, my God, the day in the life of a startup entrepreneur like myself, at the intersection of underrepresented, underestimated, underfunded, is mainly work. I spend much of my time working. If I'm not working, I'm spending time with family or friends, or on a yoga mat somewhere. My hope is that next year, if we meet our funding milestones, I'll be able to step back a little bit and enjoy life a little bit. You know, I want to travel, you know?

Justin Grammens  21:34  
Sure. All in time, though, I guess. So right now it's sort of nose to the grindstone and just keeping plugging away. You mentioned MIT; you went through a program there. I mean, you're still sort of getting the framework pulled together, I guess, right? There isn't anything live yet?

Eri O'Diah  21:53  
No, there isn't anything live yet. We're still very much early stage, still very much in research and development. Through MIT Solve's incubator, we were able to hone in on two use cases. One of them, which I think I shared with you, works very similar to Grammarly; we call it the Grammarly of bias. It's targeted at anyone, but primarily the marketing, media, and HR industries, and even consumers who are looking to be more inclusive and mindful around their social media posts, emails, and all of that. So there's a use case for communication. And we have a civic use case, which is in stealth mode right now.

Justin Grammens  22:42  
Well, exciting. One of the things that I oftentimes think about is, how can we use artificial intelligence and machine learning to change the future of work for humans? Oftentimes people are like, I'm afraid the machines are going to take over my job, right? What am I going to do now that the computers are doing everything that I do? But it feels to me like what you're building with SIID is a hugely net positive, I guess, right? Is there any fear? It feels like it's a technology that is collaborative with what humans already do today. Is that safe to say?

Eri O'Diah  23:16  
Yes, that's safe to say. It's really just a solution to support what professionals are doing in the HR space and the marketing space right now, not necessarily replace them. I think that once everyone gets comfortable with the idea that they're innately biased, right, and becomes more comfortable with that knowledge, they'd be more receptive to a solution like SIID, and see that it's not there to police your speech or strip you of your freedom of speech. It's really there to assist you in being more mindful of who you're communicating with, right, so that your bias doesn't have an adverse impact on that person. I think we're seeing that now, even in what happened recently with the young man and the school shooting, where his parents are being held liable, right? Up to a certain degree, we're going to start seeing that we'll be held accountable for our speech, right? The young woman in Prior Lake, you know, the hate incident at the high school, right? And apparently, although I don't believe she was raised to identify as an Indigenous person, you know, she's part of a tribe, and she was the one who spoke those words. It's shocking, but we're going to be held liable for that, right? Well, if you're going to be held liable for hate speech and the impact that your speech has on another human being, I think that a solution like SIID is very much needed to help you be more mindful and more inclusive.

Justin Grammens  25:13  
Yeah, I think you're very smart with regards to attacking the marketing side to begin with, you know, the communication and marketing side. Because, I mean, what business wouldn't want to market to a wider group of people properly? To me, it makes complete sense. It feels like you could actually have a pretty strong business case to bring this tool into an organization. Because, you know, now you've opened up and you're saying the right things, I guess; you're not offending people. So that's a good start, number one. But then number two, you're getting your point across to a larger group of potential customers. So why would you not want to do that? I mean, do you get any sense of pushback at all as you've gone through some of these programs, some of these beta tests?

Eri O'Diah  26:03  
Sure. Much of the pushback is around security and privacy. A lot of potential customers are concerned about freedom of speech and having their communication policed by their employer. So the pushback is around that, for the most part. But I think we can mitigate that, as we're not recording your messages; we're helping you be more mindful of the language that you're using, how it could be perceived, understood, or misunderstood, and what it actually means. The English language is riddled with biased terms. For one, much of the American English language is steeped in slavery, and many people don't know that. I mean, we're debating whether or not the truth should be taught in schools; that's a whole other conversation, right? But many people don't actually know the history of this country and what actually happened, and that's a problem. In Germany, they are very aware of what happened during the Holocaust. They are very aware of how the Nazis rose to power. They are very aware of the atrocities that were inflicted on Jews, right? And they go to the extent of banning any imagery around Nazism. And here, we don't do that, which is why we have this cycle that continues to repeat itself in this country, of hate and bias and bigotry and sexism. It just keeps going on endlessly into 2022, which is supposed to be our future. We're in the future dealing with archaic mindsets and ideologies. It's like trying to use DOS to run the future.

Justin Grammens  28:04  
Completely the wrong tool, yeah. Right, right. Yeah, I'm sitting there writing my old BASIC programs to try and program machine learning. Yeah, it's just not going to work.

Eri O'Diah  28:16  
It's not going to work, right? The sooner we come to terms with that and move on, and change our strategy and approach, the better it'll be for all of us.

Justin Grammens  28:29  
Yeah, for sure. I mean, you touched a little bit on history. And, I mean, Germany, that was what, you know, the 1930s, 1940s, and the United States goes back a lot longer than that. I think we should have learned our lesson by now, it feels like.

Eri O'Diah  28:51  
We should have learned our lesson, right? Years ago. We've failed to teach that lesson.

Justin Grammens  28:57  
I agree. Do you have a favorite book that you've read, on any topic?

Eri O'Diah  29:03  
I don't have much time for leisure reading, unfortunately. But I do have a list of my top three that I want to read next year: The 1619 Project by Nikole Hannah-Jones; Quiet by Susan Cain; and Invisible Women: Exposing Data Bias in a World Designed for Men by Caroline, I'm going to butcher her last name here, but it's Criado Perez. Sorry.

Justin Grammens  29:37  
Yeah, no, I'll be sure to find these books and put them in the podcast notes, so it'll be great. You know, as you were talking about bias, there's actually a book out there called Weapons of Math Destruction. Have you heard of that? It's pretty good. I don't have the physical book, but I listened to the audio one, read by the author, and yeah, it's really good; I forget her name. But, you know, she touches on all these types of things, where it's basically, you know, you kind of look for the data that you want to find, right? And so of course you're going to set up models or data structures in such a way that they're going to prove what you actually hope they'll prove, because of bias, like you're saying. But these other books sound really interesting. You know, The 1619 Project, I don't know what that one's actually about.