
The Best Way To Avoid A Sexist Robot Takeover? Increase Diversity In Tech.

by Sophie Hayssen

Over the past two decades, robots and machines have started to play an increasingly important, and seemingly sentient, role in our lives. We rely on them to plan our schedules, make our food, and clean our houses. But while the malevolent robot is often depicted as a world-domination-obsessed action movie villain, the potential evil of Artificial Intelligence (AI) could actually be something all too familiar: prejudice.


A recent example is the Apple Card, which, three months after its release, is being accused of sexism. The controversy came to light after David Heinemeier Hansson, a programmer famous for developing Ruby on Rails, took to Twitter to say that his Apple Card credit limit was 20x greater than his wife’s for seemingly no reason other than gender. The statement inspired others to report similar discrepancies, including Apple co-founder Steve Wozniak, who said that his credit limit was 10x higher than his wife’s. In multiple cases, users reported that their wives were, by every metric, better candidates for higher credit than they were.


In response to Hansson’s tweets, Apple’s VIP customer service team matched Hansson’s wife’s credit limit to her husband’s and launched an internal investigation. The tweets also alerted New York’s Department of Financial Services, which said that such discrimination “violates New York law.”

In an essay in Fast Company, Jamie Heinemeier Hansson explained that she had a very successful career and financial independence before ever meeting her husband, though she declined to say what that career entailed. She also noted that the response to her husband’s complaint was largely predicated on the couple’s position of privilege. Acknowledging that a lower credit limit than her husband’s doesn’t gravely impact her livelihood as a wealthy woman, she wrote, “This is not merely a story about sexism and credit algorithm black boxes, but about how rich people nearly always get their way. Justice for another rich white woman is not justice at all.”

Hansson is right: the issue of the Apple Card’s discrimination is much bigger than just her. The Apple Card is run by Goldman Sachs, which released a statement saying that applicants are evaluated individually, meaning “it is possible for two family members to receive significantly different credit decisions. In all cases, we have not and will not make decisions based on factors like gender.” As writer Kate Cox points out in Ars Technica, Goldman Sachs and Apple may not be blatantly going out of their way to discriminate against women; instead, sexism might be built into the ostensibly neutral algorithm used to determine how much credit Apple Card users get.

Leo Kelion, the technology desk editor at the BBC, said that the algorithm’s flawed decisions could have originated in the data it was initially trained on. For instance, if the algorithm was fed historical data in which women appeared financially riskier than men, it could generalize that trend and apply it, incorrectly, to all women.
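To see the mechanism Kelion describes in practice, here is a minimal sketch, using entirely synthetic data and hypothetical features, of a credit-approval model that learns bias from its training history. This is an illustration of the general problem, not Goldman Sachs’ actual system.

```python
# A minimal sketch of learned bias, NOT Goldman Sachs' actual model.
# All data is synthetic; "income" and "is_woman" are hypothetical features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

income = rng.normal(60, 15, n)    # income in thousands, synthetic
is_woman = rng.integers(0, 2, n)  # 1 = woman, 0 = man

# Biased "history": past approvals effectively required women to earn
# about 10k more than men to get the same decision.
approved = income - 10 * is_woman + rng.normal(0, 5, n) > 55

X = np.column_stack([income, is_woman])
model = LogisticRegression().fit(X, approved)

# Two applicants identical in every respect except gender:
applicants = np.array([[60, 0], [60, 1]])
print(model.predict_proba(applicants)[:, 1])
# The woman's predicted approval probability comes out lower: the model
# was never told to discriminate, it simply generalized the pattern
# present in its training labels to all women.
```

Nothing in the code mentions discrimination; the bias lives entirely in the historical labels the model learns from, which is what makes it so hard to spot inside an ostensibly neutral algorithm.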

This learned prejudice has become fairly commonplace in AI, and as the technology pervades nearly every area of life, from hiring to prison sentencing, the consequences are getting more drastic. Amazon stopped using AI in its hiring process when the tool was discovered to discriminate against women. AI also has the power to learn not just words but associations, and these associations range from harmless (e.g., smiles are good; frowns are bad) to deeply problematic. The Medium publication Towards Data Science reports that names with European origins “were significantly more easily associated with pleasant than unpleasant terms, compared to some African American names.”
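The finding Towards Data Science describes comes from a word-embedding association test in the style of Caliskan et al. (2017). Below is a rough sketch of the idea using gensim’s publicly downloadable GloVe vectors; the word and name lists here are illustrative, not the study’s exact sets.

```python
# A rough sketch of a WEAT-style association test on word embeddings.
# Word lists are illustrative, not the original study's exact sets.
# Requires gensim; downloads pretrained GloVe vectors on first run.
import numpy as np
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")

pleasant = ["love", "peace", "wonderful", "joy", "friend"]
unpleasant = ["hatred", "pain", "awful", "failure", "prison"]

def association(word):
    """Mean cosine similarity to pleasant terms minus unpleasant terms."""
    pos = np.mean([vectors.similarity(word, p) for p in pleasant])
    neg = np.mean([vectors.similarity(word, u) for u in unpleasant])
    return pos - neg

# Example names drawn from the two categories the study compared.
for name in ["amanda", "katie", "ebony", "latisha"]:
    if name in vectors:  # skip names missing from the vocabulary
        print(f"{name}: {association(name):+.3f}")
# More positive scores mean the embedding ties the name more closely to
# pleasant terms; systematic gaps between groups reveal learned bias.
```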

This tendency was recently put on display in ImageNet Roulette, an app that showed which labels an image-classification AI associated with a face. To illustrate how the app worked, NYLON writer Allison Stubblebine used the example of a photograph of Kacey Musgraves that the algorithm labeled “enchantress, witch.” Associations like these start off funny, even entertaining, but they become much grimmer once race is involved: as NYLON reported, when Julia Carrie Wong, a senior technology reporter at The Guardian, used the app, it labeled her photo with anti-Asian racial epithets.


In “Is Artificial Intelligence Racist?” Maurizio Santamicone describes a key way to eliminate bias in AI: change the people who develop it, and have those developers take de-biasing precautions. For Santamicone, the diversity needed in AI isn’t just a matter of identity but of styles of thinking. If algorithms are trained exclusively by scientists who think rigidly within a data set, they will inevitably develop prejudice. Instead, as deep learning researcher Rachel Thomas said, “We need more experts about people, like human psychology, behavior and history. AI needs more unlikely people.”

As AI continues its seemingly inevitable transformation into eerie sentience, this call for diversity signals an important, almost clichéd lesson. When we look at people not as data sets but as fully complex, individual human beings, we tap into the best parts of humanity: compassion, sympathy, respect, and curiosity. It should be imperative for all AI developers to train their algorithms in those elements of humanity as well.

Photo by Andy Kelly on Unsplash

