Posts in Bias
Who Controls Our Algorithmic Future? - Datanami

Alex Woodie

The accelerating pace of digitization is bringing real, tangible benefits to our society and economy, which we cover daily in the pages on this site. But increased reliance on machine learning algorithms brings its own unique set of risks that threaten to unwind progress and turn people against one another. Three speakers at last week’s Strata Data Conference in New York put it all in perspective.

Read More
Start-Ups Use Technology to Redesign the Hiring Process - NY Times

Iris Bohnet, a behavioral economist and professor at the Harvard Kennedy School, spoke to the founders of two behavioral design start-ups, Kate Glazebrook of Applied and Frida Polli of Pymetrics, for the latest on the algorithmic design revolution that is transforming hiring practices.

Read More
Women in the Workplace 2017 - LeanIn.Org and McKinsey

More companies are committing to gender equality. But progress will remain slow unless we confront blind spots on diversity—particularly regarding women of color, and employee perceptions of the status quo.

Women remain underrepresented at every level in corporate America, despite earning more college degrees than men for 30 years and counting. There is a pressing need to do more, and most organizations realize this: company commitment to gender diversity is at an all-time high for the third year in a row.

Despite this commitment, progress continues to be too slow—and may even be stalling. Women in the Workplace 2017, a study conducted by LeanIn.Org and McKinsey, looks more deeply at why, drawing on data from 222 companies employing more than 12 million people, as well as on a survey of over 70,000 employees and a series of qualitative interviews. One of the most powerful reasons for the lack of progress is a simple one: we have blind spots when it comes to diversity, and we can’t solve problems that we don’t see or understand clearly.

Read More
Here's why gender equality is taking so long - World Economic Forum

The World Economic Forum now estimates that global gender parity may be more than 170 years away. Earlier estimates put it at 80 years, then 120; the projected wait keeps growing. The Forum's Annual Gender Gap Report shows slow progress and minimal change in many countries worldwide. What is causing this glacial pace of change, something the airline industry calls a “creeping delay”?

There are many headwinds that can lengthen the time required for desired systemic change, but there is one I’d like to address here, head on, and it’s this: unconscious bias.

In general, there is a lack of awareness about who others are and what their capabilities and inherent qualities may be. In corporations, this often manifests as a culture that is unfriendly or unhelpful to women.

Read More
Artificial Intelligence: Making AI in our Images - Savage Minds

Savage Minds welcomes guest blogger Sally Applin

Hello! I’m Sally Applin. I am a technology anthropologist who examines automation, algorithms and Artificial Intelligence (AI) in the context of preserving human agency. My dissertation focused on small independent fringe new technology makers in Silicon Valley, what they are making, and, most critically, how the adoption of the outcomes of their efforts impacts society and culture locally and/or globally. I’m currently spending the summer in a corporate AI Research Group where I contribute to anthropological research on AI. I’m thrilled to blog for the renowned Savage Minds this month and hope many of you find value in my contributions.

Read More
Debiasing AI Systems - Luminoso Blog

One of the most-discussed topics in AI recently has been the growing realization that AI-based systems absorb human biases and prejudices from training data. While this has only recently become a hot news topic, AI organizations, including Luminoso, have been focused on this issue for a while. Denise Christie sat down with Luminoso’s Chief Science Officer, Rob Speer, to talk about how AI becomes biased in the first place, the impact such bias can have, and - more importantly - how to mitigate it.
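The interview above doesn't spell out a specific technique, but one widely used mitigation for biased word embeddings is to project out a learned "gender direction" from each vector. The sketch below uses tiny invented 3-d vectors rather than a real embedding, and is not necessarily Luminoso's exact method:

```python
import numpy as np

def debias(vec, bias_direction):
    """Subtract the component of vec that lies along bias_direction."""
    b = bias_direction / np.linalg.norm(bias_direction)
    return vec - np.dot(vec, b) * b

# Invented 3-d "embedding": the bias direction is she - he
he = np.array([1.0, 0.0, 0.2])
she = np.array([-1.0, 0.0, 0.2])
gender_direction = she - he

# A profession word that has absorbed a spurious gender component
engineer = np.array([0.4, 0.9, 0.1])
engineer_debiased = debias(engineer, gender_direction)

# After projection, "engineer" is orthogonal to the gender direction
print(np.dot(engineer_debiased, gender_direction))
```

Real systems apply this kind of projection across an entire vocabulary, typically while preserving pairs like "he"/"she" that legitimately encode gender.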

Read More
Words ascribed to female economists: 'Hotter,' 'feminazi.' Men?: 'Goals,' 'Nobel.' - The Washington Post

In 1970, the economics department at the University of California at Berkeley hired three newly minted economics PhDs from the Massachusetts Institute of Technology. Two - both men - were hired as assistant professors. But a woman, Myra Strober, was hired as a lecturer, a position of inferior pay and status and no possibility of tenure. When she asked the department chairman why she was denied an assistant professorship, he put her off with excuses. She kept pressing him until he gave a frank answer: She had two young children; the department couldn't possibly put her on the tenure track.

So Strober took another offer. In 1972, she became the first female economist at Stanford's Graduate School of Business. "They didn't know what to make of me," she said. The faculty retreat, which had been held every year at a men's club, had to be moved. There were jokes about putting a bag over her head so they could keep going to the club.

"It was like trying to run a race with one of your legs tied behind you," Strober said of the culture.

Read More
AI May Soon Replace Even the Most Elite Consultants - HBR

Amazon’s Alexa just got a new job. In addition to her other 15,000 skills like playing music and telling knock-knock jokes, she can now also answer economic questions for clients of the Swiss global financial services company, UBS Group AG.

According to the Wall Street Journal (WSJ), a new partnership between UBS Wealth Management and Amazon allows some of UBS’s European wealth-management clients to ask Alexa certain financial and economic questions. Alexa will then answer their queries with the information provided by UBS’s chief investment office without even having to pick up the phone or visit a website. And this is likely just Alexa’s first step into offering business services. Soon she will probably be booking appointments, analyzing markets, maybe even buying and selling stocks. While the financial services industry has already begun the shift from active management to passive management, artificial intelligence will move the market even further, to management by smart machines, as in the case of Blackrock, which is rolling computer-driven algorithms and models into more traditional actively-managed funds.

Read More
Machines trained on photos learn to be sexist towards women - Wired

Last Autumn, University of Virginia computer-science professor Vicente Ordóñez noticed a pattern in some of the guesses made by image-recognition software he was building. “It would see a picture of a kitchen and more often than not associate it with women, not men,” he says.

That got Ordóñez wondering whether he and other researchers were unconsciously injecting biases into their software. So he teamed up with colleagues to test two large collections of labeled photos used to “train” image-recognition software.

Their results are illuminating. Two prominent research-image collections—including one supported by Microsoft and Facebook—display a predictable gender bias in their depiction of activities such as cooking and sports. Images of shopping and washing are linked to women, for example, while coaching and shooting are tied to men. Machine-learning software trained on the datasets didn’t just mirror those biases, it amplified them. If a photo set generally associated women with cooking, software trained by studying those photos and their labels created an even stronger association.
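The amplification effect described above can be illustrated with a toy calculation: compare how strongly an activity label is associated with women in the training data versus in a model's predictions. All counts below are invented for illustration, not taken from the study:

```python
# Toy illustration of bias amplification in a labeled image dataset.

def gender_ratio(counts):
    """Fraction of images with this activity label that depict women."""
    return counts["woman"] / (counts["woman"] + counts["man"])

# Hypothetical counts for images tagged "cooking"
training_counts = {"woman": 66, "man": 34}    # association in the data
prediction_counts = {"woman": 84, "man": 16}  # association in model output

data_bias = gender_ratio(training_counts)
model_bias = gender_ratio(prediction_counts)

# A positive gap means the model's association is stronger than the
# data's: the bias has been amplified, not merely mirrored
amplification = model_bias - data_bias
print(f"data: {data_bias:.2f}, model: {model_bias:.2f}, "
      f"amplified by {amplification:+.2f}")
```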

Mark Yatskar, a researcher at the Allen Institute for Artificial Intelligence, says that phenomenon could also amplify other biases in data, for example related to race. “This could work to not only reinforce existing social biases but actually make them worse,” says Yatskar, who worked with Ordóñez and others on the project while at the University of Washington.

Read More
Alexa, Siri, Cortana: Our virtual assistants say a lot about sexism - Science Friction

OK, Google. We need to talk. 

For that matter — Alexa, Siri, Cortana — we should too.

The tech world's growing legion of virtual assistants added another to its ranks last month, with the launch of Google Home in Australia.

And like its predecessors, the device speaks in dulcet tones and with a woman's voice. She sits on your kitchen table — discreet, rotund and white — at your beck and call and ready to respond to your questions.

But what's with all the obsequious, subservient small talk? And why do nearly all digital assistants and chatbots default to being female?

A handmaid's tale

Feminist researcher and digital media scholar Miriam Sweeney, from the University of Alabama, believes the fact that virtual agents are overwhelmingly represented as women is not accidental.

"It definitely corresponds to the kinds of tasks they carry out," she says.

Read More
How Silicon Valley's sexism affects your life - Washington Post

It was a rough week at Google. On Aug. 4, a 10-page memo titled "Google's Ideological Echo Chamber" started circulating among employees. It argued that the disparities between men and women in tech and leadership roles were rooted in biology, not bias. On Monday, James Damore, the software engineer who wrote it, was fired; he then filed a labor complaint to contest his dismissal.

We've heard lots about Silicon Valley's toxic culture this summer - venture capitalists who proposition female start-up founders, man-child CEOs like Uber's Travis Kalanick, abusive nondisparagement agreements that prevent harassment victims from describing their experiences. Damore's memo added fuel to the fire, arguing that women are more neurotic and less stress-tolerant than men, less likely to pursue status, and less interested in the "systemizing" work of programming. "We need to stop assuming that gender gaps imply sexism," he concludes.

Like the stories that came before it, coverage of this memo has focused on how a sexist tech culture harms people in the industry - the women and people of color who've been patronized, passed over, and pushed out. But what happens in Silicon Valley doesn't stay in Silicon Valley. It comes into our homes and onto our screens, affecting all of us who use technology, not just those who make it.

Read More
We tested bots like Siri and Alexa to see who would stand up to sexual harassment - Quartz

Women have been made into servants once again. Except this time, they’re digital.

Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Home peddle stereotypes of female subservience—which puts their “progressive” parent companies in a moral predicament.

People often comment on the sexism inherent in these subservient bots’ female voices, but few have considered the real-life implications of the devices’ lackluster responses to sexual harassment. By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers’ actions as normal or acceptable.

Read More
Why we desperately need women to design AI - Medium

At the moment, only about 12–15% of the engineers who are building the internet and its software are women.

Here are a couple examples that illustrate why this is such a big problem:

  • Do you remember when Apple released its health app a few years ago? Its purpose was to offer a ‘comprehensive’ access point to health information and data. But it left out a major health issue that almost all women deal with, and it took a year to fix that hole.
  • Then there was the frustrated middle school-aged girl who enjoyed gaming but couldn’t find an avatar she related to. So she analyzed 50 popular games and found that 98% of them had male avatars (mostly free!), while only 46% had female avatars (mostly available for a charge!). The numbers are even more askew when you consider that almost half of gamers are women.

We don’t want a repeat of these kinds of situations. And we’ve been working to address this at Women 2.0 for over a decade. We think a lot about how diversity — or the lack of it — has affected, and is going to affect, the technology outputs that enter our lives. These technologies engage with us. They determine our behaviors, thought processes, buying patterns, world views… you name it. This is part of the reason we recently launched Lane, a recruitment platform for female technologists.

Read More
Look Who’s Still Talking the Most in Movies: White Men - New York Times

With “Wonder Woman” and “Girls Trip” riding a wave of critical and commercial success at the box office this summer, it can be tempting to think that diversity in Hollywood is on an upswing.

But these high-profile examples are not a sign of greater representation in films over all. A new study from the University of Southern California’s Viterbi School of Engineering found that films were likely to contain fewer women and minority characters than white men, and when they did appear, these characters were portrayed in ways that reinforced stereotypes. And female characters, in particular, were generally less central to the plot.

Read More
Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks.

ON A SPRING AFTERNOON IN 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid’s blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs.

Just as the 18-year-old girls were realizing they were too big for the tiny conveyances — which belonged to a 6-year-old boy — a woman came running after them saying, “That’s my kid’s stuff.” Borden and her friend immediately dropped the bike and scooter and walked away.

But it was too late — a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80.

Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store.

Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile.

Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden — who is black — was rated a high risk. Prater — who is white — was rated a low risk.

Read More