Even These Robots Judging a Beauty Contest Don’t Like Dark Skin

White supremacy has infected everything, even our technology, judging from the results of a beauty contest decided by robots.

Instead of relying only on characteristics like facial symmetry and wrinkles to identify the most attractive women, the machines appear to have factored in skin color as well.

According to The Guardian, “roughly 6,000 people from more than 100 countries submitted photos in the hopes that artificial intelligence, supported by complex algorithms, would determine that their faces most closely resembled ‘human beauty’”.

The results of the beauty contest were surprising: most of the winners were white, only a few were Asian, and just one had dark skin.

Maybe it should come as no surprise that human beings with racial bias create algorithms that mimic that bias.

This is all reminiscent of Tay, Microsoft's chatbot, which went on several racist rants.

From Gizmodo:

While things started off innocently enough, Godwin’s Law—an internet rule dictating that an online discussion will inevitably devolve into fights over Adolf Hitler and the Nazis if left for long enough—eventually took hold. Tay quickly began to spout off racist and xenophobic epithets, largely in response to the people who were tweeting at it—the chatbot, after all, takes its conversational cues from the world wide web. Given that the internet is often a massive garbage fire of the worst parts of humanity, it should come as no surprise that Tay began to take on those characteristics.

At one point, Microsoft’s chatbot said, “I f—ing hate n—–s, I wish we could put them all in a concentration camp…”

The machine judging the contest, Beauty.AI, was built by Youth Laboratories and supported by Microsoft. Some of Beauty.AI's staff said the machine favored white women because "the data the project used to establish standards of attractiveness did not include enough minorities."
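
To see how that can happen, here is a minimal, purely illustrative sketch (not Beauty.AI's actual code; every number and feature name is made up): a scorer that learns an "ideal face" by averaging the features of its training examples will fold skin tone into its notion of beauty whenever the training sample skews light-skinned, even though no one explicitly told it to judge skin color.

```python
# Illustrative sketch only -- hypothetical data, not Beauty.AI's method.
# It shows how a model trained on an unrepresentative sample can penalize
# dark skin even when skin tone is never named as a criterion.

def average(faces):
    """Component-wise mean of the feature vectors."""
    n = len(faces)
    return [sum(f[i] for f in faces) / n for i in range(len(faces[0]))]

def score(face, prototype):
    """Higher score = closer to the learned 'ideal' face (negative squared distance)."""
    return -sum((a - b) ** 2 for a, b in zip(face, prototype))

# Hypothetical training faces rated "attractive": features are
# (facial symmetry, skin smoothness, skin tone), each scaled 0..1.
# The sample skews heavily toward light skin, so the learned
# prototype quietly encodes light skin as part of "beauty".
training_faces = [
    (0.92, 0.88, 0.15),
    (0.90, 0.91, 0.12),
    (0.95, 0.85, 0.18),
    (0.89, 0.90, 0.20),
    (0.93, 0.87, 0.80),  # only one dark-skinned example
]
prototype = average(training_faces)

light_skinned = (0.91, 0.89, 0.15)
dark_skinned  = (0.94, 0.92, 0.85)  # more symmetric and smoother, but darker

print(score(light_skinned, prototype))  # higher score
print(score(dark_skinned, prototype))   # lower score, despite better features
```

The dark-skinned face scores worse on "beauty" purely because it sits farther from a prototype built from mostly light-skinned examples.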

Still, the robot's choices highlight the pitfall of assuming that machines will automatically be less biased than their human counterparts.
