Britain’s spymasters are using artificial intelligence (AI) to predict whether people who watch violent videos online could become dangerous terrorists.
The head of MI5 has revealed that the organisation’s experts have deployed AI and machine learning tools that can sift through videos being watched by terrorist suspects to see what level of risk individuals pose to society.
It will allow computer programs to do the work once carried out by hundreds of spies. The move will also spare staff from having to watch disturbing images of beheadings and torture, which can cause post-traumatic stress disorder.
Details of MI5’s use of AI and machine learning were revealed by Ken McCallum, the organisation’s director general, to students at Glasgow University.
He was explaining why MI5 wants to recruit maths graduates – as he was himself – and how they will become the spies of the future.
In the speech, delivered in June this year, he revealed that when he became a spy, converting raw audio obtained from phone taps and bugging was “a painstaking task for hundreds of professionals sat wearing headphones”.
But he added: “Today, we want to automate as much as possible of that foundational conversion of audio into searchable text – freeing up our analysts to focus on extracting the intelligence insights that count.
“But the challenging nature of our audio data means that commercial speech-to-text solutions often can’t do what we need – at least not with the precision that high-stakes work rightly demands.
“So our data scientists build, train and deploy our own machine learning models, continually improving them based on real feedback – giving our people a huge productivity boost, enabling them to apply their analytical skills to the true secrets and mysteries.
“This interplay between maths, computing science, engineering and human expertise is a critical dynamic for us.
“Building a model is just the first step; the real test is using it on live operations. Applied maths at its sharpest.”
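MI5 has not published its models, but the workflow Mr McCallum describes – converting intercepted audio into searchable text with a machine learning model so analysts can focus on the intelligence it contains – can be sketched with openly available tools. The snippet below is purely illustrative: it uses the open-source Whisper model as a stand-in for the agency’s bespoke systems, and the file name is hypothetical.

```python
# Illustrative sketch only: turning raw audio into searchable text.
# Whisper is a publicly available model used here as a stand-in;
# MI5 says it builds and trains its own models on its own data.
import whisper

model = whisper.load_model("base")          # small pretrained speech-to-text model
result = model.transcribe("intercept.wav")  # hypothetical audio file name
print(result["text"])                       # transcript an analyst could search
```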
‘Another grateful nod to Alan Turing’
He then added that AI was being used to monitor whether potential terrorists or individuals on the path to violent radicalisation posed a real risk to society.
He said: “A second example of applying AI is in detecting violence in images. Understanding whether, say, a prolific contributor to extreme Right-wing online forums is also watching graphic beheading videos can help in assessing the level of risk they might pose.
“But we don’t need or want to view all the sport they’re also watching. So with another grateful nod to Alan Turing, we again turn to machine learning.
“We have put in place automated capabilities to detect violent material within large data streams.”
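The agency’s “automated capabilities” are likewise not public, but the underlying technique – sampling frames from a video stream, scoring each with an image classifier and flagging those above a risk threshold – can be sketched as below. The two-class head here is an untrained placeholder; a working system would be trained on labelled examples, and the threshold and labels are assumptions for illustration.

```python
# Illustrative sketch only: flagging frames in a video that a classifier scores as violent.
# The classifier head is an untrained placeholder on a pretrained backbone;
# a real system would be fine-tuned on labelled "violent / not violent" data.
import cv2
import torch
from torchvision import models, transforms

# Pretrained backbone with a 2-class head (hypothetical labels: 0 = benign, 1 = violent)
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.eval()

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def flag_violent_frames(video_path: str, threshold: float = 0.9, stride: int = 30):
    """Score every `stride`-th frame; return (timestamp, score) pairs above the threshold."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 25.0
    flagged, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % stride == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            batch = preprocess(rgb).unsqueeze(0)
            with torch.no_grad():
                score = torch.softmax(model(batch), dim=1)[0, 1].item()
            if score >= threshold:
                flagged.append((index / fps, score))
        index += 1
    capture.release()
    return flagged
```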
Colonel Richard Kemp, a counter-terrorism expert and former commander of infantry troops in Afghanistan, said the move will help save lives.
He told The Telegraph: “AI will give our intelligence services a vital step forward in identifying and assessing threats at home and abroad and this will help them save lives.
“Our intelligence agencies also know that terrorists and others who threaten us will be using AI to improve their own capabilities as well.
“As always, the world of intelligence is a race to keep ahead as all sides harness the latest technology.
“I was once involved in analysing vast quantities of national intelligence every day. AI will make our successors today much more effective in this work.
“There is, though, always the risk that technology is seen as a panacea and it remains important that more traditional intelligence tools are not neglected.
“The other benefit of this is that individuals will not have to sift through hundreds of explicit and violent images.”