Say what now???
Have you ever stood at a bathroom sink where neither the water nor the automated soap dispenser would activate for you? Back in 2017, BOSSIP reported on the racial bias baked into technology just like that. Black people always seem to end up with the short end of the stick when it comes to how advanced software is programmed. No shock there. However, this latest round of racist tech has forced a major social media brand to issue a public mea culpa.
According to the BBC, Facebook issued a public apology after it was brought to their attention that a news video centered on Black men prompted the algorithm to ask users if they would “like to see more videos about primates”. The inference is that if you watched a video about Black men, then you must also want to see “other” monkeys, gorillas, and orangutans. No bueno, to say the very least.
In a statement, Facebook called the offensive faux pas a “clearly unacceptable error” and followed up with, “We apologize to anyone who may have seen these offensive recommendations.”
You’d think that Zuckerberg and co. would have learned their lesson after one of their e-peers, Google, took a similarly embarrassing L in 2015 when its photo app labeled Black people “gorillas”.
Facebook’s statement continued: “We disabled the entire topic-recommendation feature as soon as we realized this was happening so we could investigate the cause and prevent this from happening again.”
“As we have said, while we have made improvements to our AI, we know it’s not perfect and we have more progress to make.”
Last year, Facebook created an “inclusive product council” tasked with examining how its algorithms exhibit racial bias.
Maybe they’re on vacation…
*story by BOSSIP