Prime minister Keir Starmer has announced that the government will establish a “national capability” to deal with violent disorder in the wake of racist rioting across England, but campaigners and civil society groups say they are concerned about the surveillance implications of the initiative and its damaging effect on wider civil liberties.
The government said the initiative, known as the National Violent Disorder Programme, will bring together the best policing capabilities from across the UK to share intelligence on the activity of violent groups so that authorities can swiftly intervene to make arrests.
The programme announcement follows the outbreak of violent far-right riots in more than a dozen English towns and cities, which specifically targeted mosques, hotels housing asylum seekers, immigration centres, and random people of colour.
“We will establish a national capability across police forces to tackle violent disorder. These thugs are mobile, they move from community to community, and we must have a policing response that can do the same,” said Starmer.
“Shared intelligence, wider deployment of facial-recognition technology, and preventive action – criminal behaviour orders to restrict their movements before they can even board a train, in just the same way we do with football hooligans.”
A government press release provided more detail, noting that “local insight and data” will be used to gain a national understanding of how far-right organisers are operating, which will include the British Transport Police flagging spikes in train ticket sales that could be linked to organised violent disorder.
It added that the programme “will also consider how we can deploy facial-recognition technology, which is already used by some forces, more widely across the country. This will mean criminals can be targeted, found and brought to justice quickly.”
The programme will further support the swift deployment of “surge teams” to bolster police forces faced with intelligence that suggests organised violence will take place in a particular area.
“Communities have a right to feel safe without deliberate organised violence or thuggery in our streets. Criminals need to face the full force of the law and today we made clear that the police have our strong support in keeping the streets safe,” said home secretary Yvette Cooper.
“We will work with senior police officers across the country to make sure there is rapid intelligence sharing and swift action to stop violent disorder and make sure criminals pay the price.”
Chair of the National Police Chiefs’ Council (NPCC) Gavin Stephens added: “We look forward to working with government and receiving more details on the creation of a National Violent Disorder Programme and further work on tools such as live facial recognition.”
Computer Weekly contacted the Home Office for more detail about the programme, including whether it was referring to live or retrospective facial-recognition deployments and what information-sharing deficiencies between forces it was trying to fix, but received no direct response.
Programme concerns
Despite acknowledging the clearly far-right nature of the current disorder during a press conference announcing the programme, Starmer also said that the new initiative would be used to identify agitators from all parts of the ideological spectrum.
According to the Network for Police Monitoring (Netpol) – which monitors and resists policing that is excessive, discriminatory or threatens civil liberties – the programme will clearly be used to target the organised anti-fascists who went out to oppose the far right across the country on Wednesday 7 August.
“If you’ve been following the #spycops inquiry, you’ll know political policing units have a long history of obsessively targeting intelligence gathering against organised anti-fascists,” it posted on X (formerly Twitter). “That is certainly what the latest unit, which Starmer plans to expand, will have been told to do.”
In a follow-up blog post published 13 August 2024, Netpol added that any prospect of Labour rowing back the anti-protest police powers created by the Public Order Act 2023 (which placed significant new restrictions on those seeking to protest and lowered the bar for what is considered “serious” disruption) is now dead in the water.
“What we can expect instead is more funding for – and more central coordination of – protest intelligence gathering, including live-facial recognition,” it said.
“The prime minister actively supports the police’s dismissal of objections to the intrusion, unreliability and discriminatory nature of facial recognition, and the infringement on civil liberties that the extensive filming of demonstrations entails. We may also see more public order legislation and a greater willingness to quickly deploy riot units onto the streets.
“After the immediate crisis recedes, expanded police surveillance is just as likely to focus on movements for social, racial and climate justice as on the far right.”
Silkie Carlo, director of privacy campaign group Big Brother Watch, described the prime minister’s plan to roll out more facial recognition in response to recent disorder as “alarming”, saying it threatens rather than protects democracy.
“This AI surveillance turns members of the public into walking ID cards, is dangerously inaccurate and has no explicit legal basis in the UK,” she said. “To promise the country ineffective AI surveillance in these circumstances was frankly tone deaf and will give the public absolutely no confidence that this government has the competence or conviction to get tough on the causes of these crimes and protect the public.”
Both Netpol and Big Brother Watch – along with 28 other civil society groups – have now signed a letter to Starmer voicing their “serious concerns” over facial-recognition surveillance.
“In times of crisis, upholding the rule of law is paramount – however, live facial recognition operates in a legal and democratic vacuum, and it is our view that its use for public surveillance is not compatible with the European Convention on Human Rights,” they wrote.
“We join you in condemning the racist, violent and disorderly scenes across the country. However, to rush in the use of technology which has a seriously negative bearing on our rights and freedoms would not only fail to address the causes of this dangerous violence, but set a chilling precedent.”
Computer Weekly contacted the Home Office about all of the concerns raised, and whether it still holds the view that facial-recognition technology is adequately covered by existing legislation under the new Labour administration, but did not receive direct answers to its questions.
“We constantly review the use of facial recognition technology by police to keep our streets safe and ensure we restore public confidence in our police,” said a Home Office spokesperson.
Ongoing police tech concerns
In November 2023, the outgoing biometrics and surveillance camera commissioner for England and Wales, Fraser Sampson, questioned the crime prevention capabilities of facial recognition, arguing that the authorities were largely relying on its chilling effect, rather than its actual effectiveness in identifying wanted individuals.
He also warned of generally poor oversight over the use of biometric technologies by police, adding that there are real dangers of the UK slipping into an “all-encompassing” surveillance state if concerns about these powerful technologies aren’t heeded.
Sampson also previously warned in February 2023 about UK policing’s general “culture of retention” around biometric data, telling Parliament’s Joint Committee on Human Rights (JCHR) that the default among police forces was to hang onto biometric information, regardless of whether it was legally allowed.
He specifically highlighted the ongoing and unlawful retention of millions of custody images of people never charged with a crime, noting that although the High Court ruled in 2012 that these images must be deleted, the Home Office – which owns most of the biometric databases used by UK police – said this could not be done because the database they are held on has no bulk deletion capability.
A prior House of Lords inquiry into UK policing’s use of advanced algorithmic technologies – which explored the use of facial recognition and various crime prediction tools – also found in March 2022 that these tools pose “a real and current risk to human rights and to the rule of law. Unless this is acknowledged and addressed, the potential benefits of using advanced technologies may be outweighed by the harm that will occur and the distrust it will create.”
In the case of “predictive policing” technologies, Lords noted their tendency to produce a “vicious circle” and “entrench pre-existing patterns of discrimination” because they direct police patrols to low-income, already over-policed areas based on historic arrest data.
On facial recognition, they added it could have a chilling effect on protest, undermine privacy and lead to discriminatory outcomes.
After a short follow-up investigation looking specifically at facial recognition, Lords noted UK police were expanding their use of facial-recognition technology without proper scrutiny or accountability, despite lacking a clear legal basis for their deployments. They also found there were no rigorous standards or systems of regulation in place to control forces’ use of the technology.