The Apple Card Didn't 'See' Gender

The Apple credit card, launched in August, ran into major problems last week when users noticed that it seemed to offer smaller lines of credit to women than to men. The scandal spread on Twitter, with influential techies branding the Apple Card “fucking sexist,” “beyond f’ed up,” and so on. Even Apple’s amiable cofounder, Steve Wozniak, wondered, more politely, whether the card might harbor some misogynistic tendencies. It wasn’t long before a Wall Street regulator waded into the timeline of outrage, announcing that it would investigate how the card works to determine whether it breaches any financial rules.

The response from Apple only added confusion and suspicion. No one from the company seemed able to describe how the algorithm even worked, let alone justify its output. While Goldman Sachs, the issuing bank for the Apple Card, insisted right away that there was no gender bias in the algorithm, it failed to offer any proof. Finally, Goldman landed on what sounded like an ironclad defense: The algorithm, it said, had been vetted for potential bias by a third party; moreover, it doesn’t even use gender as an input. How could the bank discriminate if no one ever tells it which customers are women and which are men?

This explanation is doubly misleading. For one thing, it is entirely possible for algorithms to discriminate on gender, even when they are programmed to be “blind” to that variable. For another, imposing willful blindness to something as critical as gender only makes it harder for a company to detect, prevent, and reverse bias on exactly that variable.

The first point is more obvious. A gender-blind algorithm could end up biased against women as long as it draws on any input that happens to correlate with gender. There’s ample research showing how such “proxies” can lead to unwanted biases in algorithms. Studies have shown, for example, that creditworthiness can be predicted by something as simple as whether you use a Mac or a PC. Other variables, such as home address, can serve as proxies for race. Similarly, where a person shops might conceivably overlap with information about their gender. The book Weapons of Math Destruction, by Cathy O’Neil, a former Wall Street quant, describes many situations where proxies have helped create horribly biased and unfair automated systems, not just in finance but also in education, criminal justice, and health care.
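To make the proxy problem concrete, here is a minimal sketch in Python using entirely synthetic data. The feature names and numbers are hypothetical, not anything known about the Apple Card; the point is only that a model trained without a gender column can still reproduce a gender gap when a correlated proxy carries the signal.

```python
# Minimal sketch (synthetic data, hypothetical features): a model that never
# sees gender can still produce gender-skewed decisions if any input
# correlates with gender.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Gender is never given to the model, but it shapes a "proxy" feature,
# e.g. where a person shops (purely illustrative).
gender = rng.integers(0, 2, n)           # 0 = man, 1 = woman
proxy = gender + rng.normal(0, 0.5, n)   # correlates strongly with gender
income = rng.normal(50, 10, n)           # a gender-neutral feature

# Historical labels that were themselves biased: at the same income, women
# were approved less often. The proxy carries that signal forward.
approved = (income + rng.normal(0, 5, n) - 8 * gender) > 50

X = np.column_stack([proxy, income])     # note: no gender column
model = LogisticRegression().fit(X, approved)

preds = model.predict(X)
print("approval rate, men:  ", preds[gender == 0].mean())
print("approval rate, women:", preds[gender == 1].mean())
# The gap persists even though the model is nominally "gender-blind".
```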

The idea that removing an input eliminates bias is “a very common and dangerous misconception,” says Rachel Thomas, a professor at the University of San Francisco and the cofounder of Fast.ai, a project that teaches people about AI.

This will only become a bigger headache for consumer companies as they become more reliant on algorithms to make critical decisions about customers—and as the public becomes more suspicious of the practice. We’ve seen Amazon pull an algorithm used in hiring due to gender bias, Google criticized for a racist autocomplete, and both IBM and Microsoft embarrassed by facial recognition algorithms that turned out to be better at recognizing men than women, and white people than those of other races.


What this means is that algorithms need to be carefully audited to make sure bias hasn’t somehow crept in. Yes, Goldman said in last week’s statement that it did just that. But the very fact that customers’ gender is not collected would make such an audit less effective. According to Thomas, companies must, in fact, “actively measure protected attributes like gender and race” to be sure their algorithms are not biased on them.

The Brookings Institution published a useful report in May on algorithmic bias detection and mitigation. It recommends examining the data fed to an algorithm as well as its output to check whether it treats, say, females differently from males, on average, or whether there are different error rates for men and women.
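Once gender labels are collected, that kind of check is straightforward. Below is a minimal sketch of such an audit, assuming you hold arrays of the model’s decisions, the eventual repayment outcomes, and separately recorded gender labels; the function name and the false-negative measure are illustrative assumptions, not code from the Brookings report.

```python
# Sketch of a disparity audit: compare average treatment and error rates
# across groups, per the Brookings recommendation described above.
import numpy as np

def audit_by_group(preds, actual, group):
    """Print approval and error rates for each group label."""
    for g in np.unique(group):
        mask = group == g
        approval = preds[mask].mean()
        error = (preds[mask] != actual[mask]).mean()
        # False negatives: creditworthy applicants the model rejected.
        fn = ((preds[mask] == 0) & (actual[mask] == 1)).mean()
        print(f"group {g}: approval={approval:.2%}  "
              f"error={error:.2%}  false-negative share={fn:.2%}")

# Hypothetical usage, with arrays collected outside the lending model itself:
# audit_by_group(model_decisions, repayment_outcomes, gender_labels)
```

A large gap in approval rates, or in error rates, between groups is exactly the kind of signal an audit is meant to surface, and it can only be computed if the protected attribute is recorded somewhere.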
