Are the algorithms that power dating apps racially biased?

If the algorithms powering these match-making systems contain pre-existing biases, is the onus on dating apps to counteract them?
WIRED

A match. It’s a small word that hides a heap of judgements. In the world of online dating, it’s a good-looking face that pops out of an algorithm that’s been quietly sorting and weighing desire. But these algorithms aren’t as neutral as you might think. Like a search engine that parrots society’s racial prejudices back at the people who use it, a match is tangled up in bias. Where should the line be drawn between “preference” and prejudice?

First, the facts. Racial bias is rife in online dating. Black people, for example, are ten times more likely to contact white people on dating sites than vice versa. In 2014, OKCupid found that black women and Asian men were likely to be rated substantially lower than other ethnic groups on its site, with Asian women and white men being the most likely to be rated highly by other users.

If these are pre-existing biases, is the onus on dating apps to counteract them? They certainly seem to learn from them. In a study published last year, researchers from Cornell University examined racial bias on the 25 highest grossing dating apps in the US. They found race frequently played a role in how matches were found. Nineteen of the apps requested users input their own race or ethnicity; 11 collected users’ preferred ethnicity in a potential partner, and 17 allowed users to filter others by ethnicity.

The proprietary nature of the algorithms underpinning these apps means the exact maths behind matches is a closely guarded secret. For a dating service, the primary concern is making a successful match, whether or not that reflects societal biases. And yet the way these systems are built can ripple far, influencing who hooks up, and in turn affecting the way we think about attractiveness.

“Because so much of collective intimate life starts on dating and hookup platforms, platforms wield unmatched structural power to shape who meets whom and how,” says Jevan Hutson, lead author on the Cornell paper.

For those apps that allow users to filter out people of a certain race, one person’s predilection is another person’s discrimination. Don’t want to date an Asian man? Untick a box and people who identify within that group are booted from your search pool. Grindr, for example, gives users the option to filter by ethnicity. OKCupid similarly lets its users search by ethnicity, as well as by a list of other categories, from height to education. Should apps allow this? Is it a realistic reflection of what we do internally when we scan a bar, or does it adopt the keyword-heavy approach of online porn, segmenting desire along ethnic search terms?

Filtering can have its benefits. One OKCupid user, who asked to remain anonymous, tells me that lots of men start conversations with her by saying she looks “exotic” or “unusual”, which gets old pretty quickly. “From time to time I turn off the ‘white’ option, because the app is overwhelmingly dominated by white men,” she says. “And it is overwhelmingly white men who ask me these questions or make these remarks.”

Even if outright filtering by ethnicity isn’t an option on a dating app, as is the case with Tinder and Bumble, the question of how racial bias creeps into the underlying algorithms remains. A spokesperson for Tinder told WIRED it does not collect data regarding users’ ethnicity or race. “Race has no role in our algorithm. We show you people that meet your gender, age and location preferences.” But the app is rumoured to measure its users in terms of relative attractiveness. By doing this, does it reinforce society-specific ideals of beauty, which remain prone to racial bias?

In 2016, an international beauty contest was judged by an artificial intelligence that had been trained on thousands of photos of women. Around 6,000 people from more than 100 countries then submitted photos, and the machine picked the most attractive. Of the 44 winners, nearly all were white. Only one winner had dark skin. The creators of this system had not told the AI to be racist, but because they fed it comparatively few examples of women with dark skin, it decided for itself that light skin was associated with beauty. Through their opaque algorithms, dating apps run a similar risk.

“A big motivation in the field of algorithmic fairness is to address biases that arise in particular societies,” says Matt Kusner, an associate professor of computer science at the University of Oxford. “One way to frame this question is: when is an automated system going to be biased because of the biases present in society?”

Kusner compares dating apps to the case of an algorithmic parole system, used in the US to gauge criminals’ likelihood of reoffending. It was shown to be biased: it was much more likely to give a black person a high-risk score than a white person. Part of the issue was that it learnt from biases inherent in the US justice system. “With dating apps, we've seen people accepting and rejecting people because of race. So if you try to have an algorithm that takes those acceptances and rejections and tries to predict people’s preferences, it's definitely going to pick up these biases.”
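Kusner’s point can be made concrete with a toy simulation. The sketch below is entirely hypothetical: the group labels, acceptance rates and frequency-counting “model” are invented for illustration, not drawn from any real app. It shows how a system trained only on past accept/reject decisions reproduces the bias baked into those decisions, without race ever being an explicit rule.

```python
import random
from collections import defaultdict

random.seed(0)

# Hypothetical training data: simulated users accept same-group
# candidates 70% of the time and other-group candidates 30%.
groups = ["A", "B"]
swipes = []
for _ in range(10_000):
    user_g = random.choice(groups)
    cand_g = random.choice(groups)
    p_accept = 0.7 if user_g == cand_g else 0.3
    swipes.append((user_g, cand_g, random.random() < p_accept))

# A naive matcher: estimate P(accept) per (user group, candidate group)
# from historical frequencies, as a connection-rate optimiser would.
counts = defaultdict(lambda: [0, 0])  # (accepts, total)
for user_g, cand_g, accepted in swipes:
    counts[(user_g, cand_g)][1] += 1
    counts[(user_g, cand_g)][0] += accepted

def predicted_accept(user_g, cand_g):
    accepts, total = counts[(user_g, cand_g)]
    return accepts / total

# The learned model strongly favours same-group pairings, even though
# group membership was never written into the ranking logic.
print(f"same-group:  {predicted_accept('A', 'A'):.2f}")  # close to 0.70
print(f"cross-group: {predicted_accept('A', 'B'):.2f}")  # close to 0.30
```

Nothing in the “model” mentions race; it simply optimises for what worked before, and what worked before was biased.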

But what’s insidious is how these choices are presented as a neutral reflection of attractiveness. “No design choice is neutral,” says Hutson. “Claims of neutrality from dating and hookup platforms ignore their role in shaping interpersonal interactions that can lead to systemic disadvantage.”

One US dating app, Coffee Meets Bagel, found itself at the centre of this debate in 2016. The app works by serving each user a single potential partner (a “bagel”) each day, which the algorithm has specifically plucked from its pool based on what it thinks that user will find attractive. The controversy came when users reported being shown partners solely of the same race as themselves, even though they had selected “no preference” for partner ethnicity.

“Many users who say they have ‘no preference’ in ethnicity actually have a very clear preference in ethnicity [...] and the preference is often their own ethnicity,” the site’s cofounder Dawoon Kang told BuzzFeed at the time, explaining that Coffee Meets Bagel’s system used empirical data, suggesting people were attracted to their own ethnicity, to maximise its users’ “connection rate”. The app still exists, although the company did not answer a question about whether its system was still based on this assumption.

There’s an important tension here: between the openness that “no preference” suggests, and the conservative nature of an algorithm that wants to optimise your chances of getting a date. By prioritising connection rates, the system is saying that a successful future is the same as a successful past; that the status quo is what it needs to maintain in order to do its job. So should these systems instead counteract these biases, even if a lower connection rate is the end result?
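One way to picture that tension is a deliberately simplistic sketch. Everything below is invented for illustration (the connection rates, the category names and the exploration bonus are assumptions, not anything a real app discloses): a ranker that optimises only the historical connection rate keeps serving the past, while a counterweight trades some of that rate for a broader pool.

```python
# Hypothetical figures: assumed historical connection rates, not real data.
history = {"same_group": 0.70, "cross_group": 0.30}

def rank_by_connection_rate(candidates):
    # "A successful future is the same as a successful past":
    # sort candidates purely by how well their category did before.
    return sorted(candidates, key=lambda c: history[c], reverse=True)

def rank_with_exploration(candidates, bonus=0.5):
    # One possible counterweight: boost the under-served category,
    # accepting a lower short-term connection rate for a wider pool.
    return sorted(
        candidates,
        key=lambda c: history[c] + (bonus if c == "cross_group" else 0.0),
        reverse=True,
    )

print(rank_by_connection_rate(["same_group", "cross_group"]))
# ['same_group', 'cross_group']
print(rank_with_exploration(["same_group", "cross_group"]))
# ['cross_group', 'same_group']
```

The size of that bonus is exactly the policy question: how much short-term “success” is a platform willing to give up to stop replaying the status quo?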

Kusner suggests that dating apps need to think more carefully about what desire means, and come up with new ways of quantifying it. “The vast majority of people now believe that, when you enter a relationship, it's not because of race. It's because of other things. Do you share fundamental beliefs about how the world works? Do you enjoy the way the other person thinks about things? Do they do things that make you laugh and you don't know why? A dating app should really try to understand these things.”

Easier said than done, though. Race, gender, height, weight – these are (relatively) straightforward categories for an app to put into a box. Less easy is worldview, or sense of humour, or patterns of thought; slippery notions that might well underpin a true connection, but are often hard to define, even when an app has 800 pages of intimate knowledge about you.

Hutson agrees that “un-imaginative algorithms” are a problem, particularly when they’re based around questionable historical patterns such as racial “preference”. “Platforms could categorise users along entirely new and creative axes unassociated with race or ethnicity,” he suggests. “These new modes of identification may unburden historical relationships of bias and encourage connection across boundaries.”

Long before the internet, dating would have been tied to the bars you went to, the church or temple you worshipped at, the families and friends you socialised with on the weekends; all often bound to racial and economic biases. Online dating has done a lot to break barriers, but it has also carried on many outdated ways of thinking.

“My dating scene has been dominated by white men,” says the anonymous OKCupid user. “I work in a very white industry, I went to a very white university. Online dating has definitely helped me meet people I wouldn’t otherwise.”

There are signs that nudging users towards a wider range of ethnicities does have an impact. One 2013 analysis of OKCupid found that users from all racial backgrounds were equally likely to “cross a racial boundary” when reciprocating romantic contact, and those that replied to cross-race messages would go on to have more interracial exchanges. Other research suggests that online dating could increase rates of interracial marriage.

And dating apps have made efforts to change the way they deal with race. Last year, Grindr ran an anti-discrimination campaign called “Kindr”, after years of criticism that the service had become a home to outright racist behaviour. A spokesperson for the company said it had “taken several steps to foster a more inclusive and respectful community” and that it is “aware of a minority of users who may not act as inclusively as we would like while using the app”.

Squashing hateful language is one thing; considering how race permeates the data that underpins your app is another. Bias goes deep, and app makers need to decide how far they want to go in digging it up. At a time of political polarisation and social division, they need to think about how far they want to go in bringing people together, even if the system doesn’t necessarily think it would make a good match.


This article was originally published by WIRED UK