Granary Square, part of the King’s Cross development where facial recognition is being used in CCTV systems. Photograph: John Sturrock

Regulator looking at use of facial recognition at King's Cross site

Information commissioner says use of the technology must be ‘necessary and proportionate’

The UK’s privacy regulator said it was studying the use of controversial facial recognition technology by property companies, amid concerns that its use in CCTV systems at the King’s Cross development in central London may not be legal.

The Information Commissioner’s Office warned businesses using the surveillance technology that they needed to demonstrate its use was “strictly necessary and proportionate” and had a clear basis in law.

The data protection regulator added it was “currently looking at the use of facial recognition technology” by the private sector and warned it would “consider taking action where we find non-compliance with the law”.

On Monday, the owners of the King’s Cross site confirmed that facial recognition software was used around the 67-acre, 50-building site “in the interest of public safety and to ensure that everyone who visits has the best possible experience”.

The site’s owners are among the first landowners or property companies in Britain to acknowledge deploying the software, which has been described by a human rights pressure group as “authoritarian”, partly because it captures images of people without their consent. Canary Wharf is also interested in deploying the technology.


Hannah Couchman, a policy and campaigns officer from Liberty, said it amounted to “a disturbing expansion of mass surveillance that threatens our privacy and freedom of expression as we go about our everyday lives”. She added: “There has been no transparency about how this tool is being deployed and who it is targeting.”

Concerns were also voiced by Tony Porter, the regulator responsible for overseeing the use of CCTV, who called on ministers to introduce “robust and transparent legislation” in order “to protect the rights and privacy of people going about their business”.

The King’s Cross site, mostly to the north of the mainline terminus, includes the headquarters of Google and the Central Saint Martins art school. The Guardian is also sited on the fringe of the development.

Cameras using the software are used by police forces to scan faces in large crowds in public places such as streets, shopping centres, football stadiums and music events such as the Notting Hill carnival. Images harvested can then be compared to a database of suspects and other persons of interest.
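Conceptually, the matching step such systems perform can be reduced to comparing a numerical “embedding” of a captured face against the embeddings of people on a watchlist. The Python sketch below is a minimal illustration of that idea only; the match_against_watchlist function, the random vectors standing in for embeddings and the 0.6 similarity threshold are all hypothetical, and real deployments use proprietary models whose details the operators in this story have not disclosed.

    import numpy as np

    def normalise(v):
        """Scale a vector to unit length."""
        return v / np.linalg.norm(v)

    def match_against_watchlist(probe, watchlist, threshold=0.6):
        """Return (name, score) for the best watchlist match above the
        threshold, or None if no face on the list is similar enough."""
        probe = normalise(probe)
        best_name, best_score = None, threshold
        for name, reference in watchlist.items():
            score = float(np.dot(probe, normalise(reference)))  # cosine similarity
            if score > best_score:
                best_name, best_score = name, score
        if best_name is None:
            return None
        return best_name, best_score

    # Toy usage: random vectors stand in for real face embeddings.
    rng = np.random.default_rng(seed=1)
    watchlist = {"suspect_a": rng.normal(size=128),
                 "suspect_b": rng.normal(size=128)}
    probe = watchlist["suspect_a"] + 0.1 * rng.normal(size=128)  # noisy re-sighting
    print(match_against_watchlist(probe, watchlist))

In a scheme of this kind, the similarity threshold is what trades false matches against missed ones, which is one reason the accuracy figures discussed below vary so widely between studies.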

The use of facial recognition software by South Wales police is being challenged in the courts by an office worker in Cardiff, in a test case backed by Liberty, the result of which is keenly anticipated by regulators and across the industry.

Despite that, the Welsh force unveiled plans to give officers facial recognition apps on their phones so they can identify suspects in the street without having to take them back to a police station. Deployment of the technology was recently backed by Sajid Javid when he was home secretary.

But there are also doubts about the accuracy of the technology. Researchers at the University of Essex, invited by the Met police to study the force’s trials of its facial recognition software, concluded that they could be sure the right person had been identified in only 19% of the 42 cases studied.

Experts have also raised concerns that facial recognition technology has a racial bias: it is less effective at accurately distinguishing black people, although few studies have been conducted by the authorities in the UK.

Last year a researcher at MIT’s Media Lab in the US concluded that software supplied by three companies made mistakes in 21% to 35% of cases for darker-skinned women. By contrast, the error rate for light-skinned men was less than 1%.

King’s Cross would not provide any detail about the software used by its surveillance cameras, beyond saying in a statement that they used a “number of detection and tracking methods, including facial recognition” across the development.

There were also “sophisticated systems in place to protect the privacy of the general public”, a spokesperson for King’s Cross added in a statement, although again these were not spelled out.

Daragh Murray, a senior lecturer in human rights law at the University of Essex, said it was difficult to debate the ethics or legality of what King’s Cross was doing because “it is very difficult to adjudicate in the abstract. They’ve given us very little information as to what they are doing.”

Last year a trial of facial recognition software at the Trafford Centre in Greater Manchester was halted because every visitor was being checked against a set of 30 suspects and missing persons provided by the police; regulators were concerned the trial monitored too many ordinary people.

The King’s Cross development is owned by a consortium of Argent, a property developer; Hermes Investment Management, on behalf of the BT Pension Scheme; and the Australian pension scheme AustralianSuper.
