3061: AI for All: How Lenovo is Tackling Bias and Accessibility in Tech
Tech Talks Daily · October 19, 2024 · 19:24 · 15.53 MB

In this episode of Tech Talks Daily, I'm joined by Ada Lopez, Senior Manager of Lenovo's Product Diversity Office, recorded live at Lenovo Tech World in Seattle. Ada is on a mission to make AI more inclusive, accessible, and free of bias, and her passion for driving change through technology is truly inspiring.

We dive into the critical issue of AI bias and explore how algorithms, often unintentionally, reinforce gender, racial, and social biases. Ada breaks down real-world examples, from AI systems used in criminal justice to everyday applications like Google searches and loan approvals, illustrating how bias can creep into these systems. She also shares how Lenovo is actively working to mitigate AI bias through its product diversity office, ensuring that inclusivity and accessibility are built into the design process from the start.

Ada also sheds light on how Lenovo's AI initiatives are making a tangible difference, including their partnership with the Scott Morgan Foundation to help ALS patients preserve their voices, and the creation of tactile markings on ThinkPads to support users with visual impairments. We also discuss the excitement within Lenovo's development teams about building ethical AI systems and fostering empathy in product design.

This conversation is a must-listen for anyone interested in the future of AI, the challenges of ensuring fairness and accessibility in technology, and how major tech companies like Lenovo are leading the charge for responsible innovation. Join us as we explore the impact of AI on our lives today—and what's next for the technology of tomorrow.

[00:00:04] [SPEAKER_00]: Welcome back to the Tech Talks Daily Podcast. Today, I'm thrilled to be joined by Ada Lopez, Senior Manager of Lenovo's Product Diversity Office. We're here live at Lenovo Tech World in Seattle. And I have to say, when I met Ada earlier, her energy, her passion, and her dedication to making a real difference and impact on the world with technology were just impossible to ignore.

[00:00:32] [SPEAKER_00]: And I think her enthusiasm for creating accessible and inclusive AI solutions that truly change people's lives was nothing short of inspiring. And I knew I had to ask her to come on the show and share that journey. And thankfully, she said yes.

[00:00:49] [SPEAKER_00]: And Ada is doing incredible work tackling AI bias, making technology more inclusive, and ensuring that Lenovo products are designed with everybody in mind. So enough from me, let's dive in.

[00:01:02] [SPEAKER_00]: Thanks for joining me today. Could you tell everyone listening a little about who you are and what you do?

[00:01:06] [SPEAKER_01]: It's really a pleasure to be here at Tech World. My name is Ada Lopez, and I have the pleasure of leading Lenovo's Product Diversity Office.

[00:01:15] [SPEAKER_00]: Fantastic. And there's so much I want to talk with you about today. And there's a lot of hype around AI, as there is in every tech conference. But a subject I really wanted to dive deep in with yourself is AI bias. So just to set the scene, though, for people hearing about it for the first time,

[00:01:32] [SPEAKER_00]: can you give an example of AI bias? What is AI bias? What does it look like?

[00:01:36] [SPEAKER_01]: So when we think about AI bias, we need to consider situations in which we use AI. For example, if we use AI software to solve crime, and we consider how individuals are registered by the AI, what recommendations are made, and how some populations are more targeted than others. That's one example.

[00:01:59] [SPEAKER_01]: An example of AI bias can be as simple as something you're more familiar with, like a Google search. You look up a keyword like "captain," and you get all these different names of men and no recommendations for women. Now, you can ask that same sort of question to an AI.

[00:02:35] [SPEAKER_01]: So just AI bias creeps in everyday life. And sometimes we don't even see it. And that's really the problem.

[00:02:51] [SPEAKER_01]: Because our students, our kids are growing up in this world of AI. And they don't always know how to vet the information that they're getting. So they ask questions about historical figures, and then they may only get the results of certain types of historical figures.

[00:03:10] [SPEAKER_01]: So when we develop our AI, we need to think about, are we asking our AI these hard questions that our students are going to be asking, that our customers are going to be asking? And what kind of output is this AI giving us?

[00:03:27] [SPEAKER_01]: So when we ask questions like, show me a teacher in the classroom, are they all going to be women? Show me a doctor, are they all going to be men? And we can go on and on.

[00:03:40] [SPEAKER_00]: Yeah, I think algorithms are everywhere now, whether it be the recruitment process or applying for a loan, that algorithm is going to determine yes or no, isn't it, ultimately?

[00:03:52] [SPEAKER_01]: And that's another place that we have seen AI bias, in applying for a loan. And if you think about how important that is, you're a first-time home buyer, or you're trying to get a credit card for a special project, a dream vacation, whatever it may be, right?

[00:04:08] [SPEAKER_01]: A loan gives you an opportunity to have a great experience in your life. And if the AI is already biased, it's going to determine the course of your life and the quality of your life.

[00:04:20] [SPEAKER_01]: So absolutely, we need to think about what is going on with these algorithms. How are they being tested? Are the developers looking into these diversity questions and making sure that we are minimizing gender bias and bias against specific ethnic groups?

[00:04:40] [SPEAKER_01]: There's a lot that we really need to consider, even age bias. When we talk about this bias, we really need to think about not just the algorithms going in and the operational data sources; we also need to consider how we're testing against it.

[00:04:57] [SPEAKER_01]: And then, after all that is said and done, we still have to consider how this AI is changing when it's out there in the wild.

[00:05:05] [SPEAKER_00]: Yeah, 100% with you. And as we are at Lenovo Tech World, a question I've got to ask is, how do you at Lenovo, especially with the Product Diversity Office, how do you ensure that diversity by design process integrates inclusive design principles throughout the product development lifecycle?

[00:05:23] [SPEAKER_00]: And are there any examples of how this approach has maybe impacted your work with Lenovo products, for example?

[00:05:29] [SPEAKER_01]: Absolutely. In recent months, we've seen a lot of development of new AI technologies. And that's really made us consider our process. So when we think about how we develop, for example, hardware, right?

[00:05:47] [SPEAKER_01]: We go ahead and we have an idea, we have a prototype, we test this idea, this prototype with users of different backgrounds, we get their input, right?

[00:05:59] [SPEAKER_01]: But now, AI is a little different. AI is a little different than developing hardware, because the background of the user may not impact the feedback that you get in the same way it would when you're testing hardware, right?

[00:06:20] [SPEAKER_01]: So now you have to consider other factors. And that's really, it's taken us as a team to sit down together and think about where are others getting caught? Where are we getting caught? And it's in that question asking, it's in that testing phase, right?

[00:06:35] [SPEAKER_01]: It's asking the right prompts and evaluating the outputs, asking really hard questions. And we already have some ideas, unfortunately, because we've already seen other examples of AI gone wrong. So we're like, okay, we already know that there's a diversity risk with, as I mentioned before, show me a school teacher versus show me a doctor versus show me a nurse and some of these professions, right?

[00:07:05] [SPEAKER_01]: So now we have to consider categories, our dimensions of diversity, and make sure that we are crafting questions against these dimensions of diversity that test the AI and test the output. So that when we ship something out, we can say we have looked into some of these questions and, hopefully, catch AI bias.

[00:07:35] [SPEAKER_01]: And make fixes before it goes out into the wild.

[00:07:39] [SPEAKER_00]: Love that. And before you came on the podcast to join me today, I was doing a little research on you. And one of the things that stood out to me was your background in education and your experience in managing both product and project teams. So how do you approach the challenge of educating development teams on mitigating AI bias, and also fostering inclusivity in product design? There's a lot going on there, isn't there?

[00:08:02] [SPEAKER_01]: Okay, so that's really interesting.

[00:08:05] [SPEAKER_01]: Because in the product diversity office, we have really focused on inclusive design for our hardware, our software products. And we've had a lot of conversations with teams about changing the way that they do things and building in accessibility. And some of those conversations have been met with some resistance. It's hard to fill that knowledge gap. And it's hard to get our teams to buy in to doing something different when they don't understand it.

[00:09:02] [SPEAKER_01]: But AI is different.

[00:09:03] [SPEAKER_01]: Our teams are already excited to learn. So we say, okay, we have to build this process together. If you find questions that maybe our team didn't think about, share them with us. And I think a really important piece of this has been sharing best practices with other companies.

[00:09:20] [SPEAKER_01]: Lenovo signed a pledge with I'm going to say this is a French word, but I'm going to say it in English.

[00:09:27] [SPEAKER_01]: Circle Interels.

[00:09:29] [SPEAKER_00]: Cercle InterElles.

[00:09:30] [SPEAKER_01]: And this is a pledge to really minimize gender bias in AI. But what's so fantastic about it is that the 16 companies that have signed this pledge get together and exchange best practices.

[00:09:44] [SPEAKER_01]: So they tell us how they're approaching this human impact that AI has so that we can learn from our failures, so that we can learn from our successes together.

[00:09:57] [SPEAKER_01]: And the idea is that if we share this knowledge, we're going to build a better society, we're all going to build better products. And because we are in such early stages, there's again this enthusiasm, this willingness to share and this willingness to learn.

[00:10:15] [SPEAKER_00]: Wow. And I think that appetite for positive change and passion for driving it is something we don't hear enough about when everyone's distracted by the shiny side of the technology.

[00:10:25] [SPEAKER_00]: And a question I've got to ask you here, something I always ask my guests is, are there any myths or misconceptions that maybe surround AI bias?

[00:10:33] [SPEAKER_00]: There's probably a few things that you hear that may frustrate you that you hear again and again, stereotypes, et cetera.

[00:10:39] [SPEAKER_00]: But are there any myths and misconceptions around AI bias that frustrate you that we can maybe lay to rest today?

[00:10:45] [SPEAKER_01]: I think the most common myth around AI is the fear, that there is a fear to use AI. The truth is, just like our development teams are excited to create this new technology, our consumers are really excited about trying this new technology.

[00:11:05] [SPEAKER_01]: We recently did a project with a local coffee shop, 321 Coffee. They employ individuals with cognitive disabilities.

[00:11:17] [SPEAKER_01]: And they had opened a new roasting facility and they wanted to train their roasters on how to go through the whole coffee roasting process, which, by the way, has a lot of steps.

[00:11:27] [SPEAKER_01]: And these employees had to rely a lot on their managers for these steps.

[00:11:32] [SPEAKER_01]: So Lenovo partnered with Applied AI Studio to create AI software that would help the employees go through the entire process where we put it on Lenovo tablets around the roasting facility.

[00:11:48] [SPEAKER_01]: And as the roasters would complete one part, the AI would prompt them to the next steps and so on.

[00:11:55] [SPEAKER_01]: And it was a really interesting experiment because there was enthusiasm around it.

[00:12:02] [SPEAKER_01]: But what happened was that it was actually so effective that at some point the employees didn't really need to rely so much on their manager.

[00:12:10] [SPEAKER_01]: We're like, this is such a win.

[00:12:11] [SPEAKER_01]: But then eventually they didn't need to rely on the AI either because they learned.

[00:12:15] [SPEAKER_01]: So we have these employees with cognitive disabilities that were really struggling to learn all of these steps.

[00:12:22] [SPEAKER_01]: And by using this process, they learned the steps.

[00:12:25] [SPEAKER_01]: That was not what we thought would happen.

[00:12:28] [SPEAKER_01]: It was an experiment, but that was the outcome.

[00:12:31] [SPEAKER_01]: And it was a great outcome.

[00:12:32] [SPEAKER_01]: And it's definitely something to celebrate.

[00:12:35] [SPEAKER_00]: And I would imagine that for business leaders listening that may be hearing about that fear side of it.

[00:12:40] [SPEAKER_00]: So the fact that we've eradicated fear and replaced it with enthusiasm for positive change is a great thing.

[00:12:45] [SPEAKER_00]: And for those business leaders listening, what advice would you give to other tech companies or indeed businesses tackling the issue of AI bias or thinking about it and ensuring that their data sources do not perpetuate that AI bias?

[00:12:59] [SPEAKER_00]: Any advice that you would share?

[00:13:00] [SPEAKER_01]: Absolutely.

[00:13:02] [SPEAKER_01]: It is so important when you're working with diversity and inclusion to build empathy.

[00:13:08] [SPEAKER_01]: And how do you build empathy?

[00:13:11] [SPEAKER_01]: Right.

[00:13:11] [SPEAKER_01]: And that's usually by having a test group. If you're building a product for individuals with disabilities, work with somebody that has that lived experience.

[00:13:25] [SPEAKER_01]: Right.

[00:13:25] [SPEAKER_01]: There's so many people excited to try it.

[00:13:27] [SPEAKER_01]: If you see that there is some fear, then recruit and work in a small pilot with the people that are excited, that have that enthusiasm.

[00:13:35] [SPEAKER_01]: And that will spread and that will give you the outcomes that you're looking for.

[00:13:40] [SPEAKER_00]: And we have spoken a lot around AI bias today.

[00:13:44] [SPEAKER_00]: But before I let you go, another topic I'd just like to bring up is accessibility.

[00:13:47] [SPEAKER_00]: It is an ongoing challenge in the tech industry.

[00:13:50] [SPEAKER_00]: So what role does the Product Diversity Office play in ensuring that Lenovo products are accessible to a diverse range of people, including those with disabilities?

[00:14:00] [SPEAKER_00]: Are there any specific initiatives or recent developments that maybe we can shine a light on today?

[00:14:05] [SPEAKER_01]: Absolutely.

[00:14:06] [SPEAKER_01]: There's a lot to be said in that space.

[00:14:09] [SPEAKER_01]: In the keynote for Tech World, you did hear about our partnership with the Scott Morgan Foundation and the work that we're doing with our partners to really come up with some solutions for individuals living with ALS.

[00:14:24] [SPEAKER_01]: And that's important because that doesn't just impact the individual, as you saw in the video.

[00:14:30] [SPEAKER_01]: That impacts their family, their loved ones, a community.

[00:14:33] [SPEAKER_01]: So this work is important.

[00:14:35] [SPEAKER_01]: A lot of times we ask, how big is the population?

[00:14:38] [SPEAKER_01]: It's not just an individual that's going to use the technology.

[00:14:42] [SPEAKER_01]: It's how it impacts their community as a whole.

[00:14:46] [SPEAKER_01]: Lenovo traditionally is known for being a hardware company.

[00:14:50] [SPEAKER_01]: We have a partnership with a school for the blind.

[00:14:55] [SPEAKER_01]: And we put on a conference for these students.

[00:14:58] [SPEAKER_01]: And we went into this partnership thinking, oh, we're a big company.

[00:15:02] [SPEAKER_01]: This is a great way to give back to the community.

[00:15:05] [SPEAKER_01]: And it was really interesting because what we learned when we got there is that the students were really interested in giving to us.

[00:15:15] [SPEAKER_00]: Wow.

[00:15:15] [SPEAKER_01]: We asked, what exactly would you like to contribute?

[00:15:18] [SPEAKER_01]: They wanted to test our products.

[00:15:20] [SPEAKER_01]: They wanted their voices heard and they wanted to make an impact.

[00:15:24] [SPEAKER_01]: So we said, sure.

[00:15:25] [SPEAKER_01]: Here's our ThinkPad, our flagship product.

[00:15:28] [SPEAKER_01]: How would you make this more accessible?

[00:15:30] [SPEAKER_01]: And they said, we use JAWS, which is a screen reader day to day.

[00:15:35] [SPEAKER_01]: And as soon as we open our laptop, JAWS starts talking to us.

[00:15:39] [SPEAKER_01]: And we can't control the volume.

[00:15:41] [SPEAKER_01]: We could be in a public space.

[00:15:42] [SPEAKER_01]: We need to find out where the volume up and down keys are fast.

[00:15:46] [SPEAKER_01]: We need this quickly.

[00:15:47] [SPEAKER_01]: So we need a tactile marking on those buttons.

[00:15:51] [SPEAKER_01]: And we need to know the difference between one and the other.

[00:15:53] [SPEAKER_01]: And we don't want it to be something like a sticker that could fall off or you have to rely on a sighted person to put on.

[00:16:01] [SPEAKER_01]: We want something on the hardware that will not move.

[00:16:04] [SPEAKER_01]: And we did exactly that.

[00:16:06] [SPEAKER_01]: And now our ThinkPads have tactile markings on the keys that the community needed,

[00:16:12] [SPEAKER_01]: markings that the community of visually impaired students asked for.

[00:16:18] [SPEAKER_01]: We tested the markings with the school.

[00:16:21] [SPEAKER_01]: We tested this marking also with adults in the community that are visually impaired.

[00:16:26] [SPEAKER_01]: But we also tested the markings with sighted users to make sure that they weren't distracting or taking away from the product.

[00:16:34] [SPEAKER_01]: And those markings are now in our ThinkPads.

[00:16:37] [SPEAKER_01]: And the reason that testing these markings with both visually impaired and sighted individuals is important

[00:16:46] [SPEAKER_01]: is because if we're really going to make a difference in the space of accessibility, it has to work for all.

[00:16:52] [SPEAKER_01]: It has to be smarter technology for all.

[00:16:54] [SPEAKER_01]: It cannot just be something different for some.

[00:16:59] [SPEAKER_01]: It needs to be something that's going to work for everyone.

[00:17:02] [SPEAKER_00]: I think it's so important, the Lenovo Tech World event, it began with that powerful message.

[00:17:08] [SPEAKER_00]: It wasn't tagged on at the end.

[00:17:10] [SPEAKER_00]: It was front and center of the event.

[00:17:12] [SPEAKER_00]: And I think that made such a difference.

[00:17:14] [SPEAKER_00]: And just thank you for also shining a light on the great work you're doing today.

[00:17:17] [SPEAKER_00]: Really appreciate your time.

[00:17:19] [SPEAKER_01]: Thank you so much.

[00:17:20] [SPEAKER_01]: I appreciate you.

[00:17:21] [SPEAKER_00]: I'm so grateful to Ada for stopping by and having a quick conversation with me there.

[00:17:25] [SPEAKER_00]: And we did cover so much from the challenges of AI bias to the innovative ways that Lenovo Product Diversity Office

[00:17:32] [SPEAKER_00]: is designing inclusive products that actually reflect the real world diversity of its users.

[00:17:39] [SPEAKER_00]: And I often say on this podcast, diversity of thought and ensuring that an organization is as diverse as the audience it serves is critical.

[00:17:48] [SPEAKER_00]: And I think Ada's examples today from AI helping individuals with ALS to making ThinkPads more accessible to those with visual impairments,

[00:17:57] [SPEAKER_00]: these few examples really highlight the power that technology has in bridging the gaps and making a meaningful impact in people's lives.

[00:18:05] [SPEAKER_00]: And as we reflect on the role of AI in shaping the future, I think it's clear that companies like Lenovo are at the forefront of ensuring technology works for everyone,

[00:18:15] [SPEAKER_00]: not just the select few, a real timely reminder there.

[00:18:19] [SPEAKER_00]: But I'd love to hear your thoughts.

[00:18:21] [SPEAKER_00]: How do you think your business can address, yes, AI bias, but also ensure inclusivity in your products and services?

[00:18:30] [SPEAKER_00]: As always, feel free to reach out, share your ideas.

[00:18:34] [SPEAKER_00]: This isn't a monologue.

[00:18:35] [SPEAKER_00]: It is a dialogue.

[00:18:36] [SPEAKER_00]: I encourage you all, whether you're in tech, whether you're in business,

[00:18:39] [SPEAKER_00]: whether you are outside of those things,

[00:18:41] [SPEAKER_00]: and some of the topics we've explored today just resonate with you.

[00:18:44] [SPEAKER_00]: Your thoughts, your insights, your opinion matter so much,

[00:18:47] [SPEAKER_00]: and I would love to amplify your voice far and wide.

[00:18:51] [SPEAKER_00]: So as always, email me, techblogwriter@outlook.com,

[00:18:54] [SPEAKER_00]: Twitter, LinkedIn, Instagram, just at Neil C. Hughes.

[00:18:57] [SPEAKER_00]: So a big thank you to Ada for her time,

[00:19:00] [SPEAKER_00]: shining a light on the critical work that's being done at Lenovo.

[00:19:04] [SPEAKER_00]: And for everybody else, thank you for listening,

[00:19:06] [SPEAKER_00]: and I cordially invite you to join me again tomorrow, where we will do it all again.

[00:19:12] [SPEAKER_00]: So hopefully I will speak with you all again bright and early tomorrow.

[00:19:15] [SPEAKER_00]: Bye for now.