Tech giant Google announced in a blog post on Sunday that it will take strong measures to combat terrorism and violence by identifying and removing extremist and terrorist-related content from its popular video-sharing platform, YouTube. The announcement comes roughly one month after the tragic terrorist attack at Ariana Grande's concert in Manchester, England, where a suicide bomber killed 22 people and injured 120, and a couple of weeks after the shocking attack on London Bridge, in which three terrorists killed eight people using knives and a van.
Google’s general counsel Kent Walker said, “Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist content on our services.” [...] “While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now.”
Google is clearly determined to find an effective way to fight terrorism and limit the hate speech circulating on social media through extremist and fanatical videos, most of which are uploaded to YouTube. The goal is a safer internet environment for users, and the company has laid out a strategic programme to carry out its counter-extremism agenda.
Google is stepping up its reliance on technology to analyse, filter out and permanently remove extremist and terrorist videos from YouTube. According to the company, extremist and terrorist-related content is defined as any video containing supremacist or provocative religious content that propagates violence. Such videos will not be recommended, and warnings will be issued to their uploaders before removal. The company will also expand the number of “experts in YouTube’s Trusted Flagger programme”, since there is no denying the importance of the human factor in identifying and removing violent videos on YouTube.
“We will expand this programme by adding 50 expert NGOs to the 63 organisations who are already part of the programme, and we will support them with operational grants. This allows us to benefit from the expertise of specialised organisations working on issues like hate speech, self-harm, and terrorism. We will also expand our work with counter-extremist groups to help identify content that may be being used to radicalise and recruit extremists,” according to the blog.
Internet users who watch videos posted by extremist groups such as ISIS/DAESH, and who are considered potential recruits, will be automatically redirected to anti-terrorist videos through a form of targeted advertising, in the hope of changing their minds about joining such groups.
“YouTube will expand its role in counter-radicalisation efforts. Building on our successful Creators for Change programme promoting YouTube voices against hate and radicalisation, we are working with Jigsaw to implement the “Redirect Method” more broadly across Europe. This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining.”
Other social media giants, including Facebook and Twitter, will collaborate with Google and its subsidiary YouTube to establish an “international forum” that gathers tech expertise and supports smaller tech companies in developing advanced technologies to eradicate terrorism and extremism from the internet.
“Collectively, these changes will make a difference. And we’ll keep working on the problem until we get the balance right.”