Online dating tech perpetuates racial bias and sexual racism


By Apryl Williams
Word In Black

When I first set up my Tinder profile a couple of years ago, I expected to find a match instantly. The pool of endless potential daters, literally at the touch of a button, suggested that friendship, romance and possibly love, were ever so near. 

Yet, the more I swiped, the more this hopeful feeling diminished. My matches seemed more interested in talking about my race than in developing connections. I soon found out that this is part of the emotional burden shared by millions of other Black women, like me, in the online dating world.

As a researcher working at the intersection of technology, race and society, my own online dating experience as a Black woman piqued my curiosity about the software powering these systems. Who or what decides what matches are presented to us? And are we presented with the same options as everyone else at the love buffet? 

Millions across the country use dating apps, but finding a match could prove to be a challenge for African Americans, as artificial intelligence can create match lists that reflect and reinforce Eurocentric beauty standards. (Photo: Unsplash/ Flure Bunny)

As I dug deeper into the experiences of multiple Black women and other people of color, I hit a brick wall of harsh reality: Online dating technology is fraught with racial bias, and in fact, capitalizes on sexual racism. 

Evidence from a patent filed by Match Group, the conglomerate that owns more than 40 dating platforms, including Tinder, OkCupid and Match.com, shows there is nothing random about the faces that appear on our screens. It’s all decided with algorithms and data.

In my recently published book, “Not My Type,” I write about how the dating platforms we use to find romance and companionship repeatedly fail women of color, and Black women in particular. The failures include scoring metrics and sorting algorithms underpinned by centuries of racial segregation, a lack of tools for vulnerable users to report sexual racism, and more.

Despite playing a critical role in how millions of people across the world choose their partners, dating platforms receive little to no scrutiny on how they go about making matches.

Tale as old as time: beauty and racist tech

A decade ago, OkCupid published survey results about its users’ racial preferences: who they were likely to swipe right on and go on to date. Think of this survey’s results as a version of hot-or-not.

According to that survey, Black women were not hot — receiving the least engagement of all users of different races on the platform.

Contrary to the assumed promises of a tech-utopian era that would rise above “seeing color,” dating app technology not only reinforces racist beauty standards but also builds its business model on racist foundations, like many other American corporations. While racism is part of our society, online technologies often extend the reach of societal inequity, and automated systems help proliferate the damage.

For example, when you upload your photos to a dating platform, your profile is filtered through several algorithms before it is shown to other users. These algorithms may evaluate your attractiveness relative to other users. Based on perceived commonalities with the most popular users, the algorithms likely decide whose profiles you are shown and who sees yours.

The thing is, the most popular users are often evaluated as highly attractive within a mainstream European aesthetic — high cheekbones, narrow nose, straight hair, lighter skin and ideal facial symmetry. This means that daters who do not look like White Instagram models will most likely be evaluated as less attractive. 

Dating websites’ algorithms that use facial recognition software might inadvertently look for the phenotypical expression of certain desired genes in a face because the programmers who design the algorithms do so with a limited understanding of normative desirability. If the scoring algorithm is an assessment of beauty, in which beauty is a racialized aesthetic, then only those within that frame will receive the highest score. 

Those furthest from the standard are likely given the lowest attractiveness scores, which in turn influence which matches they are presented with. By this logic, I hope to demonstrate how a certain standard of beauty is evaluated, reinforced and maintained through dating platforms, whether they intend to or not.
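To make that logic concrete, here is a minimal sketch, in Python, of how a score-based matching pipeline could work. It is an illustrative assumption on my part, not Match Group’s actual code: the profile fields, the scoring rule and the ranking step are all hypothetical.

from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    attractiveness_score: float = 0.5  # hypothetical score in [0, 1], seeded by the platform
    right_swipes_received: int = 0
    left_swipes_received: int = 0

def update_score(profile: Profile) -> None:
    # Re-estimate the score from how often other users swipe right on this profile.
    # If swipes across the user base skew toward one narrow beauty standard, this
    # loop rewards everyone who fits it and penalizes everyone who does not.
    total = profile.right_swipes_received + profile.left_swipes_received
    if total:
        profile.attractiveness_score = profile.right_swipes_received / total

def rank_candidates(viewer: Profile, candidates: list[Profile], top_n: int = 10) -> list[Profile]:
    # Pair "like with like" on a single popularity score: low-scored users are
    # mostly shown other low-scored users, which reinforces the ranking.
    return sorted(
        candidates,
        key=lambda c: abs(c.attractiveness_score - viewer.attractiveness_score),
    )[:top_n]

In a pipeline like this hypothetical one, a dater scored low at sign-up against a Eurocentric standard keeps being shown, and seen, within that low band, regardless of how they actually behave.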

Newer technologies have the potential to worsen these racial dynamics. For example, the AI chatbots being integrated into these platforms are rife with racial bias: A study published in March by Cornell University found that OpenAI’s ChatGPT and Google’s Gemini AI described users of African American Vernacular English as “lazy” or “stupid.”

There is no shortage of AI models that completely fail the race test. For example, AI image generators have depicted people with lighter skin tones as CEOs, lawyers and other high-paid professionals, while showing those with darker skin tones in lower-paying jobs like janitors, fast-food workers and housekeepers.

Despite these technological expansions of racism, it’s important to remember that humans still have agency in these systems; the algorithms that match us also consider user feedback. Each right or left swipe teaches the algorithm your preferences on who you think is hot or not. We have the agency to disrupt normative, mainstream patterns of beauty pushed forward by AI.
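As a rough illustration of that feedback loop, consider this hypothetical sketch, again with made-up feature names and a made-up learning rate, of how a recommender might update a dater’s learned preferences from each swipe.

def update_preferences(prefs: dict[str, float], candidate_features: list[str],
                       swiped_right: bool, learning_rate: float = 0.1) -> None:
    # Nudge the weight of each trait on the candidate's profile:
    # up on a right swipe, down on a left swipe.
    direction = 1.0 if swiped_right else -1.0
    for feature in candidate_features:
        prefs[feature] = prefs.get(feature, 0.0) + learning_rate * direction

# A dater who deliberately swipes outside the mainstream pattern
prefs: dict[str, float] = {}
update_preferences(prefs, ["locs", "freckles"], swiped_right=True)
update_preferences(prefs, ["conventional_model_look"], swiped_right=False)
# The recommender now weights those traits up or down accordingly.

Under assumptions like these, every swipe is a small vote about what the system should treat as desirable, which is exactly why deliberate swiping can push back against the defaults.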

By breaking away from stereotypical labels of who we might like, we challenge mainstream perceptions of what’s presented to us as attractive. Your swipes help shape the matches recommended to you. I eventually found true love through Tinder, and I believe even more of us could, if the power of digital connection were to go hand in hand with racial justice. 

This article was originally published by Word in Black.
