September 08, 2021
Facebook's most recent blunder has users heading for the exits. A recent AI glitch asked viewers who were watching a video featuring Black men whether they wanted to keep seeing "videos about Primates".
After the incident, Facebook was predictably dubbed racist, and the recommendation feature was soon disabled. Facebook has announced that it is investigating the issue and has, in the meantime, offered an apology to the public. A representative for Facebook called the issue "unacceptable" and added: "As we have said, while we have made improvements to our A.I., we know it's not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations."
"This 'keep seeing' prompt is unacceptable, @Facebook," Facebook employee Darci Groves recently tweeted. "And despite the video being more than a year old, a friend got this prompt yesterday. Friends at [Facebook], please escalate. This is egregious."
At what point are apologies no longer acceptable? Responsibility means investing in solutions for transformation. There are dedicated researchers out here that are willing to partner on this, as long as companies are willing to listen and lean into change https://t.co/DD03ttgnut

— Ayanna Howard (@robotsmarts) September 6, 2021