
Facial Recognition Use in UK Continues to Stir Controversy

Metropolitan Police Gave Images to Developer, Mayor Says

The use of facial recognition technology in a fashionable section of London is continuing to stir controversy following an admission this week that the Metropolitan Police Service shared images with a developer as part of a trial run of a surveillance system.

London's mayor, Sadiq Aman Khan, says that the city's police service shared images with the developer of a 67-acre section of King's Cross, which includes a world-famous railway station. The partnership between law enforcement and the developer, King's Cross Central Limited Partnership, ended in March 2018, the mayor says. But police officials originally claimed they were not involved in this pilot program.


"As a matter of urgency, I have asked for a report from the [Metropolitan Police Service] on this concerning development and on their wider data-sharing arrangements, including what information has been shared and with whom," Khan says.

Crime Prevention Measure

Argent, the developer behind the King's Cross Central Limited Partnership, had been using facial recognition technology in the neighborhood's CCTV system to scan pedestrians near the King's Cross railway station as part of a crime prevention effort, according to the Guardian.

While much of the King's Cross development is privately owned, most of it is open to the public. Many city residents, as well as privacy advocates, say the use of facial recognition technology without the public's knowledge is an invasion of privacy (see: Use of Facial Recognition Stirs Controversy).

The U.K. Information Commissioner's Office, Britain's chief privacy watchdog, has launched an investigation into the developer's use of facial recognition technology. The ICO has the authority to impose fines for privacy violations under the European Union's General Data Protection Regulation.

Facial Recognition in Use

The controversy over the use of facial recognition technology started in August, when several media reports revealed that the developer of the King's Cross property had used the technology.

The issue then resurfaced this week when London's mayor disclosed the involvement of the Metropolitan Police force.

A spokesperson for Argent issued a statement earlier this week saying that only two facial recognition cameras, which the company says covered a single location on King's Boulevard, were operational between May 2016 and March 2018. The company has not used the technology in that area since.

The statement also notes that during that time, data processed through the facial recognition technology system was regularly deleted, with the final removal of personal data taking place in March 2018.

"The [facial recognition technology] system was never used for marketing or other commercial purposes. The system was used only to help the Metropolitan Police and British Transport Police prevent and detect crime in the neighborhood and ultimately to help ensure public safety," according to the statement.

Although the statement from the developer tries to minimize how much facial recognition technology the company used, John Hollywood, an analyst with the RAND Corporation, tells Information Security Media Group that the statement raises more questions than it answers. Specifically, Hollywood asks why only two cameras were used and whether all the data was fully purged at the end of the test run. In addition, he notes that this type of technology does not help much in crime prevention.

"For prevention, was the model that [Metropolitan Police] shared faces of high-risk / wanted persons with King’s Cross? With what agreements, requiring what security, privacy, and civil rights protections? What enforcement actions, if any, did the use of facial recognition lead to?" Hollywood notes. "The emerging technology strongly associated with crime detection is video analytics, not facial recognition."

Previously, the BBC reported that the Metropolitan Police and British Transport Police both denied any involvement with the developer's use of facial recognition technology during the pilot program. That changed this week with the acknowledgement that the Metropolitan Police gave images to the developer as part of the test run.

By Aug. 15, however, the ICO had launched an investigation into the use of the technology, according to the BBC and other media reports. The ICO investigation is continuing, and the developer says it is cooperating, according to the statement.

Meanwhile, the BBC reports this week that Britain's surveillance camera commissioner, who oversees the use of surveillance cameras within the country, is also investigating the use of facial recognition technology within King's Cross.

Seeking Clarification

In August, Robert Evans, a partner at Argent, tried to clarify how the technology was being used.

In a letter to the mayor's office, Evans wrote that the facial recognition system tied to the CCTV system was designed to run in the background, looking for matches against a small number of so-called "flagged" individuals. These could include, for example, a person suspected of committing a crime or a missing person, according to the BBC.

Evans' letter also noted that faces were automatically blurred out when the footage was played back or captured. The system only stored the images of flagged individuals, according to the BBC.

Privacy Concerns

Over the past several years, the use of facial recognition - along with other technologies such as machine learning, artificial intelligence and big data - has stoked privacy concerns.

One of the biggest threats associated with facial recognition data is its potential use for identity theft, and processing such biometric data without a lawful basis is a direct violation of GDPR. Other challenges include data harvesting, unauthorized tracking and the misuse of data for credential theft (see: Facial Recognition: Big Trouble With Big Data Biometrics).

Last month, Sweden's Data Protection Authority issued its first fine for violations of GDPR after a school launched a facial recognition pilot program to track students' attendance without proper consent (see: Facial Recognition Use Triggers GDPR Fine).

Privacy concerns also are being raised about ZAO - a new Chinese app available through Apple's App Store that has gone viral since its release, according to Reuters. The app allows users to swap their faces with celebrities, sports stars or anyone else in a video clip.

The use of facial recognition technology is also making its way into political debates. For instance, in the U.S., Independent Vermont Sen. Bernie Sanders, a candidate for the Democratic nomination for president, has called for a ban on the use of facial recognition technology by law enforcement agencies. Meanwhile, the governments of Oakland and San Francisco have barred their police departments from using the technology.


About the Author

Apurva Venkat


Special Correspondent

Venkat is a special correspondent for Information Security Media Group's global news desk. She previously worked at companies including IDG and Business Standard, where she reported on developments in technology, business, startups, fintech, e-commerce, cybersecurity, civic news and education.




