Resisting the Rise of Facial Recognition

Growing use of surveillance technology has prompted calls for bans and stricter regulation.

Nature

5 min read
4 take-aways

What's inside?

Facial recognition technology has the potential for both good and evil – and needs clear regulations.


Editorial Rating

9

Qualities

  • Comprehensive
  • Eye Opening
  • Hot Topic

Recommendation

Facial recognition technology (FRT) is used for smartphone security, passport control, police investigations and many other applications. As the rollout of FRT accelerates, regulations to control its application are playing catch-up – and are mostly insufficient to safeguard people’s privacy and freedom. Antoaneta Roussi and Richard Van Noorden’s article is a disconcerting and important read.

Take-Aways

  • The COVID-19 pandemic accelerated the rollout of automated facial recognition technology (FRT).
  • Researchers and rights activists are questioning the legality of FRT’s use.
  • The easy availability of people’s photos online makes controlled inclusion in watch lists impossible.
  • In the hands of governments and law enforcement, FRT is a powerful tool.

Summary

The COVID-19 pandemic accelerated the rollout of automated facial recognition technology (FRT).

People have become familiar with the use of facial recognition technology (FRT) at borders, on their smartphones and in criminal investigations. Increasingly, cities all over the world are also implementing this technology to prevent and solve crimes.

“Researchers worry that the use of live-surveillance technologies is likely to linger after the pandemic.”

This wider rollout has accelerated since the start of the COVID-19 pandemic. Some countries, such as Russia, China, South Korea and India, have set up FRT video surveillance systems in their cities to help monitor people’s movements and enforce lockdowns.

Researchers and rights activists are questioning the legality of FRT’s use.

There is currently little transparency around how police and governments use FRT and the biometric data it collects. Information on where data is stored and who can access it is sparse in most cases. While laws such as the European Union’s General Data Protection Regulation (GDPR) stipulate that people have to give their consent to data collection, there are generally loopholes. For example, GDPR allows for data gathering if it is in the public interest.

“It’s very hard for an individual to understand the risks of consenting to facial surveillance…And they often don’t have a meaningful way to say ‘no’.” (Woodrow Hartzog, a computer scientist and law professor at Northeastern University in Boston)

Similarly, police can use live video surveillance if they consider it necessary for safety reasons. Many researchers and tech companies are calling for a ban on FRT until stricter regulations clearly set out what is permissible.

The easy availability of people’s photos online makes controlled inclusion in watch lists impossible.

FRT matches the information it collects against databases of images. Many people are not even aware that they might be in a searchable database. For example, in the United States, the police can run face recognition against driver’s license databases. The easy accessibility of people’s photos online also means that anyone could scrape these images and build a facial recognition database. While most social media companies prohibit harvesting images from their sites in their terms of service, monitoring and prosecuting breaches is difficult.
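The matching step can be made concrete with a short sketch. The Python snippet below is a minimal illustration, not any vendor’s actual pipeline: it assumes each face has already been reduced to a numeric embedding vector (random stand-ins here for the output of a trained face-recognition network) and runs a one-to-many search against a hypothetical watch list using cosine similarity, with a threshold deciding whether a match is reported.

```python
import numpy as np

# Hypothetical watch list: each entry maps a name to a face embedding.
# Real systems derive these vectors from a trained face-recognition
# network; random vectors stand in for them here.
rng = np.random.default_rng(seed=0)
watch_list = {f"person_{i}": rng.normal(size=128) for i in range(1000)}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(probe: np.ndarray, database: dict, threshold: float = 0.6):
    """One-to-many search: return the best match above the threshold, or None."""
    best_name, best_score = None, -1.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else None

# A probe near a watch-list entry, e.g. a new photo of the same person.
probe = watch_list["person_42"] + rng.normal(scale=0.1, size=128)
print(search(probe, watch_list))  # likely ("person_42", ~0.99)
```

The threshold is the key design choice in such a system: lowering it catches more true matches but also reports more false ones, which connects directly to the bias concerns discussed below.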

In the hands of governments and law enforcement, FRT is a powerful tool.

Surveys suggest that a significant minority of the population is concerned that the police and the state might exploit FRT to curb people’s freedoms and suppress dissent and protest. There is little to no evidence that FRT surveillance in smart-city or safe-city programs results in fewer crimes or increased safety.

“Some people…have taken to wearing masks or camouflage-like ‘dazzle’ makeup to try to confuse facial recognition systems. But their only ‘opt-out’ option is to not turn up.”

Image mismatches also remain common, particularly for darker-skinned people and women, which can easily lead to discrimination and biased decision-making.
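One way such bias is made visible, sketched here with purely illustrative data and hypothetical group labels, is to compute the false-match rate separately for each demographic group in an evaluation set; a system whose rates diverge sharply across groups will misidentify members of some groups disproportionately often.

```python
from collections import defaultdict

# Hypothetical evaluation records: each comparison notes the subject's
# demographic group, whether the pair was truly the same person, and
# whether the system declared a match. Values are illustrative only.
records = [
    {"group": "A", "same_person": False, "matched": False},
    {"group": "A", "same_person": False, "matched": True},   # false match
    {"group": "B", "same_person": False, "matched": False},
    {"group": "B", "same_person": True,  "matched": True},
    # ... in practice, thousands of comparisons per group
]

def false_match_rates(records):
    """False-match rate per group: matches declared among true non-matches."""
    non_matches = defaultdict(int)
    false_matches = defaultdict(int)
    for r in records:
        if not r["same_person"]:
            non_matches[r["group"]] += 1
            if r["matched"]:
                false_matches[r["group"]] += 1
    return {g: false_matches[g] / n for g, n in non_matches.items()}

print(false_match_rates(records))  # e.g. {"A": 0.5, "B": 0.0}
```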

About the Authors

Antoaneta Roussi is a freelance journalist and writer based in Nairobi, Kenya. Additional reporting by Richard Van Noorden, a features editor at Nature Research.

