Google Photos Algorithm Mislabels African Americans As 'Gorillas'

Google Photos' Most Recent Epic Fail Is Horrendously Racist

When Google fails, the descent is epic.

Its most recent cyber faux pas centres on Google Photos' reportedly faulty algorithm, which has made the tech giant look astonishingly racist.

According to Yahoo Tech, Jacky Alcine, a 21-year-old New Yorker, accidentally discovered that his Google Photos account had generated a folder titled 'Gorillas', which to his surprise contained only images of him and his friend from 2013 -- no gorillas.

Keen to investigate, he tweeted Google's chief architect of social, Yonatan Zunger, who responded:

"Sheesh" barely scratches the surface of an appropriate response.

A Google representative told Yahoo Tech:

"We’re appalled and genuinely sorry that this happened.

“We are taking immediate action to prevent this type of result from appearing.

"There is still clearly a lot of work to do with automatic image labelling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

For the most part, Google Photos' image recognition does a good job of grouping people's images accurately, so the gaffe made Jacky wonder what data the algorithm is using to classify African Americans.

This is not the first time Google has come under fire. In May, it had to apologise over how its Maps results ranked the White House when racial slurs including "nigger house" were searched.


Google has yet to provide an official explanation of how its Photos service was able to make such a mistake.

However, Twitter, true to its nature, had a copious supply of (highly inappropriate) explanations: