Technology

Posted on September 20, 2017 by staff

Google pledges anti-terror funding as UK urges tech firms to act


Google is to provide £1 million of funding for anti-extremism projects in the UK as Prime Minister Theresa May prepares to call for tech firms to do more to combat the spread of extremist material.

The Google funding will be handed out in partnership with the Institute for Strategic Dialogue (ISD) and is part of a $5m pot to tackle the problem globally.

An independent board, including academics, policymakers, educators, and representatives from creative agencies, civil society and the technology sector, will consider a first round of applications in November for grants of between £2,000 and £200,000.

Google said the funding would support “technology-driven solutions, as well as grassroots efforts such as community youth projects that help build communities and promote resistance to radicalisation”.

Kent Walker, general counsel at Google, added: “By funding experts like ISD, we hope to support sustainable solutions to extremism both online and offline.

“We don’t have all the answers, but we’re committed to playing our part. We’re looking forward to helping bring new ideas and technologies to life.”

ISD chief executive Sasha Havlicek said: “We are eager to work with a wide range of innovators on developing their ideas in the coming months.”

Prime Minister May will address the United Nations General Assembly on Tuesday and call for the likes of Google, Facebook and Twitter to take down ‘terrorist’ material within two hours.

She will also host a meeting with world leaders and tech giants, where she is expected to urge them to go “further and faster” in developing artificial intelligence tools capable of detecting terror-related propaganda.

Twitter said on Tuesday it had taken down 300,000 accounts linked to terrorism in the first six months of the year.

A G7 meeting on 20th October will assess whether tech firms are acting quickly enough.

Walker, who will represent the tech giants at the meeting with May, told BBC Radio 4’s Today programme: “Machine learning has improved but we are not all the way there yet.

“We need people and we need feedback from trusted government sources and from our users to identify and remove some of the most problematic content out there.

“Whenever we can locate this material, we are removing it. The challenge is once it’s removed, many people re-post it or there are copies of it across the web.

“And so the challenge of identifying it and identifying the difference between bomb-making instructions and things that might look similar that might be perfectly legal – might be documentary or scientific in nature – is a real challenge.”