The Covid-19 coronavirus pandemic has prompted a slew of biometric firms to update their facial-recognition technology to identify people wearing face masks.
In response to the sudden and widespread adoption of masks, facial-recognition companies across the globe have been busy developing a range of new algorithms that can identify people with “occluded” faces.
These new biometric offerings cover a wide range of use cases, from unlocking smartphones and making payments, to accessing transport and controlling borders, and use a variety of computational methods to make their identifications.
In the UK, for example, biometrics company Facewatch, which provides retailers and venues with facial recognition systems, announced on 11 May it had developed a “periocular” algorithm which allows its cameras to make identifications by scanning the area between a person’s cheekbones and eyebrows.
“This periocular recognition algorithm powered by the modern technological breakthroughs in deep learning and convolutional neural networks will enable subjects of interest to be recognised automatically, while enabling others to anonymously receive access to retailers using Facewatch,” said the announcement.
“It will further support people adhering to religious customs such as niqabs to be provided an equal user experience when engaging with identification technology.”
It added that the algorithm will be available to all existing licence holders under active maintenance, included as a standard feature at no extra cost.
The Information Commissioner’s Office (ICO) has previously said that facial recognition presents a “threat to an individual’s right to have their data processed for purposes that they did not consent or for purposes they were not sufficiently made aware”.
According to Hazel Grant, a privacy expert at Fieldfisher, the adoption of masks could further complicate the matter of consenting to facial recognition.
“It could be argued that the mask shows a person does not consent. There are many stringent conditions about using consent – for example, the requirement that it is freely given and revocable. So trying to rely on consent for facial recognition may have significant problems, depending on the circumstances,” she said.
An explosion of upgrades
One of the first companies to deploy a system it claimed could identify individuals with hidden faces was Chinese artificial intelligence (AI) firm SenseTime, which reportedly deployed the technology as early as 11 February to control building access to the headquarters of South Korean IT company LG CNS.
While other Chinese firms, including Alibaba, Hanwang Technology, Telpo and Wisesoft, quickly followed suit with similar solutions throughout February and March, facial recognition developers in the rest of the world were not far behind.
Other companies that claim to have developed mask-circumventing facial recognition since the start of the pandemic include SAFR, Innovatrics, Herta, Speech Technology Centre, Remark Holdings, Rank One, and Corsight, among others.
Ofer Ronen, head of homeland security at Corsight’s parent company Cortica, told the Jerusalem Post that the race for better live facial recognition (LFR) systems was “like the Cold War”.
“There are so many face-recognition companies worldwide and everyone is trying to get further ahead and push the technology limits more and more,” he said.
According to Hannah Couchman, policy and campaigns officer at human rights group Liberty, the pandemic should not be used as cover to normalise technology that threatens our rights. She warned against LFR being used to ease lockdown or to track who has the virus through ‘immunity passports’, which the government is in talks with private companies to develop.
“Using it as part of any lockdown exit strategy is not just a dangerous expansion of state surveillance, it puts additional risk on people who will be under pressure to return to work, and therefore to sacrifice their rights,” she said.
“It’s important to explore the opportunities offered by new technology, but there is no digital sticking plaster to address this public health emergency. The way forward must not be presented as a trade-off between privacy and public health. The Government can, and must, create a public health strategy that maintains public trust and protects our rights.”
Despite the explosion of upgrades being made to these systems, any claims that they can accurately identify masked faces remain unvalidated.
This is because the organisation that provides industry-standard accuracy testing of LFR technology, the US-based National Institute of Standards and Technology (NIST), is operating at reduced staff capacity due to the pandemic, and therefore cannot yet test them properly.
Even with NIST validation, however, the use of LFR by private companies remains a highly contentious issue, partly due to the opacity surrounding their own deployments of the tech, as well as their close collaboration with law enforcement entities.
In August 2019, the King’s Cross estate developer Argent admitted that LFR software was being used on its 67-acre site. A month later, it was revealed the developer had been using the software since May 2016, scanning tens of thousands of people without their knowledge or consent. The public outcry prompted it to halt its use of the technology.
Having initially denied any involvement, the Metropolitan Police Service (MPS) and British Transport Police eventually admitted to supplying the company with images for its facial-recognition database.
The episode prompted the ICO to launch an investigation into the use of LFR by private companies, including where the technology is used in partnership with law enforcement. It is still ongoing.
The ethics of police forces collaborating with private entities on the use of LFR technology is also under investigation by the Biometrics and Forensics Ethics Group, an arms-length advisory body to the Home Office.
In response to questions from Computer Weekly about whether the MPS is considering pausing its roll-out of live facial recognition due to the massive number of people in London wearing masks, as reported in the Evening Standard, the MPS responded: “We are looking at any potential issues to establish how it may impact on future LFR deployments.”
When asked if the MPS, which has been deploying LFR operationally since January 2020, is considering solutions that circumvent people’s use of face masks, the force said it was “not discussing” the matter.
The UK’s two police forces that use LFR, the MPS and South Wales Police (SWP), both use technology supplied by Japan’s NEC Corporation, whose algorithm is now also “capable of matching partially obstructed faces”, according to a post shared on LinkedIn by the firm’s vice-president of federal business, Benji Hutchinson.
However, the company has said it does not plan to put the technology on general sale until the 2021 fiscal year, meaning there will be a delay before any potential use by UK law enforcement.
The MPS is already involved with a facial-recognition project to identify people with hidden faces.
According to the project’s website, Face Matching for Automatic Identity Retrieval, Recognition, Verification and Management (FACER2VM) will “develop unconstrained face recognition” with the goal of delivering a “step change” in the technology and making it “ubiquitous” by 2020.
While the MPS confirmed its involvement in the project at the time, it said no technology developed by the project had been introduced yet.
Companies involved in the project include IBM, Cognitec Systems, Digital Barriers and Onfido, which sit on the project’s user group alongside the MPS and the University of Hertfordshire.
“The purpose of the FACER2VM user group is to create a forum where the results of the project will be disseminated, and pathways to impact discussed and identified,” said the project website.
A global issue
The opacity of collaboration between the public and private sectors on LFR is not a problem exclusive to the UK, but one that extends around the globe.
According to a June 2019 report by David Kaye, the United Nations (UN) Human Rights Council’s mandated expert on freedom of expression, governments and the private sector are close collaborators in the market for a vast array of digital surveillance tools, including facial recognition.
“Governments have requirements that their own departments and agencies may be unable to satisfy. Private companies have the incentives, the expertise and the resources to meet those needs. They meet at global and regional trade shows designed, like dating services, to bring them together. From there, they determine whether they are a match,” he said.
“The seller’s intentions may be legitimate. It may be that companies genuinely intend their products to be deployed for ‘lawful interception’ by authorised public authorities against legitimate targets, with the authorisation of judicial or other independent actors.
“However, this cannot be known for certain because every aspect of such collaboration – from due diligence and sales to end-user support – typically operates with limited oversight and transparency.”
Presenting his findings to the 41st session of the Human Rights Council on 26 June 2019, Kaye described the international situation as a “surveillance free-for-all in which states and industry are essentially collaborating in the spread of technology that is causing immediate and regular harm to individuals worldwide”.
Speaking at the Open Rights Group conference on 13 July, Kaye said there were two main aspects to the problem – a lack of controls on the export and transfer of surveillance technology, and a lack of legal frameworks for how governments use it – and called for companies to start undertaking “human rights impact assessments” whenever they enter new markets.