
Women in AI: Whose voice should be loudest in the diversity convo?

The conversation about diversity in AI isn’t a one-and-done deal. As the number of AI and machine learning applications grows and the reach of those products becomes more ubiquitous, centering diversity of thought in the industry remains critical. Fittingly, that conversation kicked off VB Transform 2022 at the Women in Data & AI Breakfast Panel, sponsored by Capital One.

“I believe data is the food for AI,” said panelist JoAnn Stonier, Chief Data Officer at Mastercard. “Having a diversity of thought in the data that we’re actually feeding AI and machine learning algorithms is super important. Having data scientists and other data professionals at the design sessions as we’re building products and solutions is super important if we want products that are going to reflect the society that we all want to live in.”

Stonier spoke to Mastercard’s data design principles, which set the baseline that employees need to aim for when designing with data. Alongside privacy and security, accountability, and transparency sits innovation and integrity, which encompasses the idea of making a positive social impact on the world, or designing with integrity. And now they’ve added inclusion to the list.

“Inclusion means inclusive data sets of all the right kinds of data, inclusive algorithmic inquiry to get to the right type of inputs to create your machine learning algorithmic questions, and then to get the right answers about what you’re trying to solve,” she said. “Those three parts, and of course a lot of detail, go into all of that. They really create an inclusive methodology for our AI and for our data practices in general now at Mastercard.”

“We’re trying to tailor the best, most personalized, most helpful experience [for our customers],” said Molly Parr, VP of product, digital customer experiences, enterprise products and platforms at Capital One.

That means reflecting back the experience of an incredibly diverse customer base, and staying sensitive to inadvertent, harmful biases. 

“The best way to do that is still with humans,” Parr said. “Having that diversity of thought, that diversity of teams who represent the people that we’re talking to on the other side of these devices, is the best way. We’ve gotten machines to a great place where we can train them and they’re doing all of these things way faster, computationally more complex than anything. But the last step before it meets a customer has to be that inspectability and that human-trained angle to remove that bias.”

That responsible design approach is foundational to everything, said Ya Xu, VP of engineering and head of data and AI at LinkedIn.

“It’s not just about checking if the data is balanced, if the algorithm is introducing something additional,” she said. “Do we have this responsible design concept to start with? Double-clicking on what JoAnn and Molly have said, it’s this human-centered approach. As we’re looking at how we’re building algorithms, and even thinking about how we’re evaluating, it’s so critical.”

You have to think about how this is impacting your customer, she added. With every feature launch at LinkedIn, the team evaluates not only business metrics, but the impact it might have across segments.

“Through that process of continuous monitoring, we’re making sure we’re not introducing these unintended consequences as we continue to evolve and improve our product,” she said. “That’s very ingrained in how we develop our products.”
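For readers curious what that kind of segment-level evaluation can look like in practice, here is a minimal illustrative sketch, not LinkedIn’s actual tooling: it assumes hypothetical per-user logs for a newly launched feature, each tagged with a demographic segment label and a binary engagement outcome, and flags any segment whose engagement rate drifts from the overall rate by more than a chosen threshold.

```python
from collections import defaultdict

# Hypothetical per-user observations for a newly launched feature.
# The data, segment labels, and the 5-point threshold below are all
# illustrative assumptions, not a real product's metrics.
observations = [
    {"segment": "18-24", "engaged": 1},
    {"segment": "18-24", "engaged": 0},
    {"segment": "25-34", "engaged": 1},
    {"segment": "25-34", "engaged": 1},
    {"segment": "55+",   "engaged": 0},
    {"segment": "55+",   "engaged": 0},
]

def engagement_by_segment(records):
    """Return the engagement rate per segment plus the overall rate."""
    totals, hits = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["segment"]] += 1
        hits[r["segment"]] += r["engaged"]
    overall = sum(hits.values()) / sum(totals.values())
    rates = {seg: hits[seg] / totals[seg] for seg in totals}
    return rates, overall

def flag_outlier_segments(records, threshold=0.05):
    """Flag segments whose rate deviates from the overall rate by more than `threshold`."""
    rates, overall = engagement_by_segment(records)
    return {seg: rate for seg, rate in rates.items()
            if abs(rate - overall) > threshold}

if __name__ == "__main__":
    print(flag_outlier_segments(observations))
```

In a real monitoring pipeline this check would run continuously after launch, so a regression that only affects one group surfaces alongside the topline business metrics rather than being averaged away.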

The cost of not putting effort into increasing diversity in the AI space is large. Xu sees two outcomes: One is alienating a full half of your customers or members or users. The other is seeing AI become over-regulated in an attempt to correct bias without actually digging it out at the source. But regulation will directly impact innovation.

For Stonier, the big risk is getting it wrong. Nobody wants to get fraud detection wrong, for example. But there are a million different ways that AI can go wrong, and not paying attention to bias is one of the easiest; those faults have cascading consequences, big and small. Something as simple as AI-generated panda images, consistently biased by gender, ladders up to constant gender-coding, which boxes kids into very specific roles, and eventually into very specific careers and opportunities.

“If we don’t want that future, then we have to pull those kinds of proxy variables out of our thinking,” Stonier said. “We have to look at all of the subtle ways that global societies have inculcated gender and other biases into our language and into perhaps not our thought, but machine-coded thought. And so getting it wrong limits things.”

A call to action for everyone

To truly make an impact, the call for gender diversity in the industry needs to be a unified one, the panelists agreed. The tech industry is only about 20 to 25% women, so it’s a must to bring along male allies, Xu said, and a must that they are as passionate and energized by this as women are. But she noted that male allies are often worried about making missteps.

“When they speak to it, sometimes they’re a little worried about how they are perceived,” she said. “But it’s important to know that everyone is going to get things wrong. Responsible AI and this space, it’s such a challenging thing to do. Women are going to say the wrong things, the same as men are going to say the wrong things. But we’re all here with the right intentions and we want to achieve the same things.”

Parr added, “Embedding inclusion into the dialogue, whatever it is — AI or anything else — is a huge component of getting it right. Doing that early and inclusively, whether that’s with regulators or with others in our companies or with our partners, that’s what we have to promote.” There’s still some inherent bias around who should contribute to those conversations and who should have the loudest voice, she added.

“I like the call to action very much in terms of, what can we all be doing?” she said. “Even as women, going back to when we’re running the discussions with our teams, how do we get all of the voices out, so we get the absolute best design? And that’s embedded into the principles and the way that we build.”

Stonier noted that to attract more women into tech, a woman-led inclusion conversation can be powerful, but real diversity is where the most power lies.

“When I look at the team at Mastercard and think of all the different faces, I love that it’s all these different faces. Different races, different backgrounds, different ethnicities,” she said. “I encourage all of you to be talking about this and inviting your colleagues’ diverse thoughts to the table to have the conversation with you.”

Don’t miss the whole conversation, from regulatory barriers to actually starting the conversation in your own organization and more, by registering for a free virtual general admission pass to Transform.


Author: VB Staff
Source: VentureBeat
