
London attack: Tech firms fight back in extremism row

Technology companies have defended their handling of extremist content following the London terror attack.

Prime Minister Theresa May called for areas of the internet to be closed because tech giants had provided a “safe space” for terrorist ideology.

But Google said it had already spent hundreds of millions of pounds on tackling the problem.

Facebook and Twitter said they were working hard to rid their networks of terrorist activity and support.

Google, which owns YouTube, along with Facebook, which owns WhatsApp, and Twitter were among the tech companies already facing pressure to tackle extremist content.

That pressure intensified following Saturday night’s attack, which killed seven people and injured 48. The so-called Islamic State group has claimed responsibility for the attack.

Speaking outside Downing Street on Sunday, Mrs May said: “We cannot allow this ideology the safe space it needs to breed.

“Yet that is precisely what the internet, and the big companies… provide.”

Culture Secretary Karen Bradley said tech companies needed to tackle extremist content in the same way they had removed indecent images of children.

“We know it can be done and we know the internet companies want to do it,” she told the BBC on Monday.

Google said it had invested heavily to fight abuse on its platforms and was already working on an “international forum to accelerate and strengthen our existing work in this area”.

The firm added that it shared “the government’s commitment to ensuring terrorists do not have a voice online”.

Facebook said: “Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it – and if we become aware of an emergency involving imminent harm to someone’s safety, we notify law enforcement.”

Meanwhile, Twitter said “terrorist content has no place on” its platform.

Home Secretary Amber Rudd said on Sunday that tech firms needed to take down extremist content and limit the amount of end-to-end encryption that terrorists can use.

End-to-end encryption means only the sender and the intended recipient hold the keys needed to decrypt a message, so anyone intercepting it in transit, whether a criminal or a law enforcement agency, cannot read its contents.
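By way of illustration, the sketch below shows the basic public-key principle behind end-to-end encryption, using the open-source PyNaCl library as an assumed example; messaging apps such as WhatsApp build the Signal protocol's additional key-ratcheting machinery on top of this core idea.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (illustrative only).
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; private keys never
# leave the device, so no server holds the material needed to decrypt.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# Anyone intercepting `ciphertext` in transit sees only unreadable bytes.
# Only Bob, holding his private key, can recover the plaintext.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meet at noon"
```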

The Open Rights Group, which campaigns for privacy and free speech online, warned that further regulation risked pushing terrorists' "vile networks" into the "darker corners of the web".

The way that supporters of jihadist groups use social media has changed “despite what the prime minister says”, according to Dr Shiraz Maher of the International Centre for the Study of Radicalisation (ICSR) at King’s College London.

They have “moved to more clandestine methods”, with encrypted messaging app Telegram the primary platform, Dr Maher told the BBC.

Professor Peter Neumann, another director at the ICSR, wrote on Twitter: “Blaming social media platforms is politically convenient but intellectually lazy.”

However, Dr Julia Rushchenko, a London-based research fellow at the Henry Jackson Centre for Radicalisation and Terrorism, told the BBC that more could be done by tech giants to root out such content.

She felt that the companies erred on the side of privacy, not security. “We all know that social media companies have been a very helpful tool for hate preachers and for extremists,” Dr Rushchenko said.

Investors suggested that tech firms would be more willing to take further action against extremist content if shareholders and advertisers pressured them to do so.

Jessica Ground, a UK fund manager at Schroders, told the BBC: “It’s going to be an interesting debate how you put the pressure points. It could be the money rather than the governments.”

Simon Howard, chief executive of the UK Sustainable Investment and Finance Association (UKSIF), said: "We'll need all the technology companies to do a bit more and we'll have to decide what the UK legal framework in which they do that is."
