Police forces warned on adopting AI 'without consultation or ethical safeguards'

By agency reporter
May 12, 2020

Police forces risk undermining policing-by-consent and their relations with ethnic minority communities, a report warns, as new Freedom of Information (FOI) requests reveal that only one force which has adopted AI, such as facial recognition, consulted local communities about its use. 

A Force for Good?, from the RSA’s Asheem Singh and Will Grimond, says that AI technology offers huge potential to improve policing, but that its adoption must be carried out “for purposes of improving police work rather than simply as a cost-cutting measure.” 

The Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA) says this is especially important in the context of the easing of the lockdown from Monday.  

FOI requests by the RSA to all police forces in the United Kingdom found: 

  • The main use of new Artificial Intelligence (AI) technology by forces is facial recognition software.
  • Of the police forces reporting use of AI or Automated Decision Systems (ADS), just one reported carrying out public engagement. An FOI returned in March found that the Met, which began a programme of facial recognition in February, had no record of consulting the public, despite suggesting that this would take place alongside deployment.  
  • While most police forces reported not using AI or ADS, several forces reported using ‘predictive policing’, where statistical analysis influences the deployment of police forces, and South Wales and the Metropolitan Police forces have deployed live facial recognition programmes. 

Report authors Singh and Grimond warn this could harm relations with particular communities, stating: “Racial and gender biases can be exacerbated by technologies as they are based on historic data, and we fear that a lack of transparency could undermine the principle of policing-by-consent.” 

The report also warns that the guidelines given to police staff are varied and often inadequate, failing to deal with the specific implications of using AI and ADS. This patchwork approach also means that public consultation is rarely built into the procurement and deployment process. 

The police roll-out of technology such as facial recognition has not been without controversy. Last year South Wales Police faced a court battle over its use of facial recognition, and in March the Equality and Human Rights Commission called for it to be halted until better scrutiny is available and the law has been improved. 

“Adopting new technologies without adequate cultural safeguards, especially around deliberation and transparency, risks storing up considerable problems for the future, around both community cohesion and truly innovative technological uptake”, the authors conclude. 

The report comes amid a police-enforced lockdown of the UK in response to the Covid-19 pandemic. The authors warn that increased police powers mean it is more important than ever that police use of these technologies comes with appropriate safeguards. 

A Force for Good? calls for the use of deliberation to provide scrutiny and inform the public on how AI and ADS are being used by the police, and for citizens’ juries on ethics in policing, balanced for ethnicity and gender. 

Deliberative bodies have been trialled in various settings, including this year’s climate assemblies. Last year, the RSA published a toolkit based on the results of an initial round of deliberative bodies in Democratising Decisions about Technology. 

Asheem Singh, Head of the RSA’s Tech and Society programme, said: “Over the last few years we have seen a rapid proliferation of the use of technology by our police forces. Innovation is exciting and welcome, but there are causes for concern in the lack of public engagement that has come with these technologies. Racial and gender biases can be exacerbated by technologies as they are based on historic data: we need to talk about that.  

“Our findings indicate a lack of transparency and input from the public on how these new technologies are being used, which in turn undermines the principle of policing-by-consent. It’s fine to cut costs but not at the expense of the improvements forces have made in their relations with BME communities. Law enforcement should work with civil society groups to provide proper consultation around how AI and ADS is being used.  

“This has implications beyond policing. As lockdown begins to ease from today, we need to be sure that new tech is being deployed with all the public’s best interests in mind. We have models for deliberating and discussing these complex technological challenges. We want to ensure government has the tools to do its job – but that means ensuring that those tools are beyond reproach and consented to and trusted by all.” 

* A Force for Good? is available to download here

* Royal Society for the encouragement of Arts https://www.thersa.org/


Although the views expressed in this article do not necessarily represent the views of Ekklesia, the article may reflect Ekklesia's values.