Google Launches Four Steps To Tackle Online Terrorism

Google has ramped up its efforts to tackle online terrorism with the introduction of four new steps to address the problem.

The internet giant acknowledged that the threat poses a serious challenge and more immediate action needs to be taken.

Google pledged four additional steps in the fight against online terrorism: better detection of extremist content and faster review, more experts, tougher standards, and early intervention and expanding counter-extremism work.

In a blogpost, Kent Walker, senior vice president and general counsel at Google, said: "Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all.

"Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online.

"There should be no place for terrorist content on our services.

"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now."

Google's engineers have developed technology to prevent re-uploads of known terrorist content using image-matching techniques.
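
The article does not say how this matching works; purely as a hypothetical sketch, re-upload detection can be thought of as fingerprinting each new upload and checking it against a store of fingerprints of previously removed material (the exact-match hashing below is a simplification of the perceptual image-matching such systems actually use):

import hashlib

# Hypothetical store of fingerprints of content already removed as terrorist material.
# A real system would use perceptual image/video matching rather than exact hashes.
known_flagged_fingerprints = set()

def fingerprint(data: bytes) -> str:
    # Fingerprint an uploaded file; SHA-256 stands in for a perceptual hash here.
    return hashlib.sha256(data).hexdigest()

def register_removed_content(data: bytes) -> None:
    # Record content that reviewers have taken down, so re-uploads can be caught.
    known_flagged_fingerprints.add(fingerprint(data))

def is_reupload(upload: bytes) -> bool:
    # True if the upload matches known removed content and can be blocked automatically.
    return fingerprint(upload) in known_flagged_fingerprints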

:: Better detection of extremist content

Google will devote more engineering resources to applying its most advanced machine learning research to train new content classifiers, helping it identify and remove extremist and terrorism-related content more quickly.
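
Google gives no detail about these classifiers; as a rough illustration of the general approach only, a simple text classifier that scores uploads for priority human review might be built along the following lines (the training data, labels and threshold are invented for the example and are not Google's):

# Hypothetical sketch of a content classifier used to queue material for faster review.
# This is not Google's system; it only illustrates the machine-learning approach described.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: descriptions labelled 1 (violates policy) or 0 (does not).
texts = [
    "example description of a video glorifying violence",
    "example description of an ordinary cookery video",
]
labels = [1, 0]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(texts, labels)

# New uploads scoring above a threshold would be sent to human reviewers first.
score = classifier.predict_proba(["description of a newly uploaded video"])[0][1]
if score > 0.8:
    print("Flag for priority review")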

:: More experts

Google will increase the number of independent experts in YouTube's Trusted Flagger programme.

It will expand this programme by adding 50 expert NGOs that it will support with operational grants.

It will also expand its work with counter-extremist groups to help identify content that may be used to radicalise and recruit extremists.

:: Tougher standards

The company will take a tougher stance on videos that do not clearly violate its policies.

In future, videos that contain inflammatory religious or supremacist content will appear behind an interstitial warning, and they will not be monetised, recommended or eligible for comments or user endorsements.

:: Early intervention and expanding counter-extremism work

Google-owned YouTube will expand its role in counter-radicalisation efforts. Its approach uses targeted online advertising to reach potential Islamic State recruits and redirects them towards anti-terrorist videos that can change their minds about joining.

Mr Walker said: "Collectively, these changes will make a difference. And we'll keep working on the problem until we get the balance right.

"Extremists and terrorists seek to attack and erode not just our security, but also our values, the very things that make our societies open and free. We must not let them.

"Together, we can build lasting solutions that address the threats to our security and our freedoms. It is a sweeping and complex challenge. We are committed to playing our part."

Google is also working with Facebook, Microsoft and Twitter to establish an international forum to share and develop technology, support smaller companies, and accelerate their joint efforts to tackle terrorism online.

Labour MP Yvette Cooper welcomed the pledges.

She said: "This is a very welcome step forward from Google after the Home Affairs Select Committee called on them to take more responsibility for searching for illegal content.

"The Select Committee recommended that they should be more proactive in searching for - and taking down - illegal and extremist content, and to invest more of their profits in moderation.

"News that Google will now proactively scan content and fund the trusted flaggers who were helping to moderate their own site is therefore important and welcome though there is still more to do."