Facial recognition technology (FRT) is a widespread, versatile biometric security system that goes far beyond just unlocking phones. Private and government sectors have integrated FRT into various functions, from airport security checks to cashless payment options.
Despite the rapid adoption of FRT, tech leaders are concerned about whether it’s safe to use.
Here’s what you should know about the privacy risks involved with the proliferation of FRT.
The Privacy Risks Of Face ID Integration
FRT is a modern biometric verification system that recognizes unique facial features through image scanning. It’s widely known for its speed and accuracy. Advanced FRT-enabled tools can identify individuals in crowded, high-foot-traffic areas without even stopping them for verification. But what are the dangers of this technology?
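At its core, most FRT converts a face image into a numeric "embedding" and compares it against stored templates. The Python sketch below illustrates only that matching step, using made-up placeholder vectors instead of a real face-detection model; the 0.6 distance threshold is an arbitrary assumption, not a value from any specific product.

```python
import numpy as np

# Hypothetical 128-dimensional face embeddings. In a real FRT pipeline, a
# deep-learning model would map each face image to a vector like this.
enrolled_face = np.random.rand(128)                            # stored template for a known person
probe_face = enrolled_face + np.random.normal(0, 0.02, 128)    # new scan of (roughly) the same face

def is_match(known: np.ndarray, probe: np.ndarray, threshold: float = 0.6) -> bool:
    """Declare a match when the Euclidean distance between embeddings is small."""
    distance = np.linalg.norm(known - probe)
    return distance < threshold

print(is_match(enrolled_face, probe_face))  # likely True: the probe sits close to the template
```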
1. Organizations Misusing FRT Data
Many companies could benefit from using FRT, whether to verify customer identities, track footfall, or perform similar tasks.
But many people feel unsafe undergoing facial recognition scanning at public and private establishments.
That unease stems from concerns about how various entities collect and store facial biometric data. Even if the collecting organization is legitimate, weak security practices on its end could still lead to data breaches.
Additionally, the organization may misuse this kind of customer data. Large organizations are no strangers to such practices, with companies like Amazon facing lawsuits due to their mishandling or unethical collection of user information.
2. Fraudsters Exploiting FRT Data
It’s not just big companies that pose a risk here. FRT systems collect highly sensitive facial biometric data that crooks could exploit to extract more personally identifiable information, commit fraudulent transactions, or spread damaging deepfake videos. Divulging this data carelessly puts you at unnecessary risk.
To mitigate these issues, screen everyone who requests your facial biometric data. Assess the organization’s susceptibility to data breaches, double-check whether it accepts other identity verification options, and make sure you can trust its team.
Report the incident to the FTC immediately if you fall victim to a data breach; it can guide you through recovery and any legal action against the parties responsible.
3. FRT’s Lack of Strict Regulation
The rapid adoption of FRT has outpaced the creation of explicit legal regulations. Public and private sectors are integrating facial recognition into their daily workflows faster than government agencies can keep up, and no central institution closely regulates FRT at present.
Apart from standard privacy guidelines, there should be clear limitations detailing when and where it’s appropriate to use FRT. Setting up scanners in public establishments comes across as intrusive; individuals might feel safer if they could opt for alternative identity verification or biometric scanning options.
4. More Sectors Are Quickly Adopting FRT
FRT is a versatile system that serves several private, commercial, and national security uses. It could eventually become the universal standard for biometric scanning. In fact, China has been using FRT-enabled payments at public transportation systems, malls, and convenience stores for years. More countries might adopt similar payment systems.
While convenient, widespread and uncontrolled adoption gives cybercriminals more opportunities to misuse FRT. Consumers might also become careless in sharing biometric data. This also links back to FRT’s lack of airtight regulations.
5. FRT-Equipped Systems Are Prone to Biases
A 2020 Harvard report explains how biases cause facial recognition systems to misidentify individuals, especially those with darker skin tones. These verification errors perpetuate social inequalities, and carelessly deploying biased, poorly trained FRT systems for national security and law enforcement only increases the likelihood of racial profiling.
As with other AI biases, addressing facial recognition bias requires sophisticated image processing and diverse sample datasets. FRT systems need training data covering a broader range of facial features to combat bias.
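One way teams surface this kind of bias is by measuring error rates separately for each demographic group in an evaluation set. The sketch below uses entirely hypothetical records and group labels; it simply shows the bookkeeping, not any particular vendor's audit process.

```python
from collections import defaultdict

# Hypothetical evaluation records: (demographic_group, predicted_match, true_match)
results = [
    ("group_a", True, True), ("group_a", False, False), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, False), ("group_b", True, False),
]

# Count false matches (system said "match" for two different people) per group.
false_matches = defaultdict(int)
trials = defaultdict(int)
for group, predicted, actual in results:
    trials[group] += 1
    if predicted and not actual:
        false_matches[group] += 1

for group in trials:
    rate = false_matches[group] / trials[group]
    print(f"{group}: false match rate = {rate:.2f}")
# A large gap between groups signals bias that broader training data should reduce.
```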
6. Sites Can Monitor Minors Through FRT
Some platforms use FRT to prevent kids from accessing restricted content. For instance, a 2021 BBC article states that the Chinese gaming company Tencent enforces curfews for minors through facial recognition, locking out underage players from 10 PM to 8 AM daily.
Although well-intended, these types of monitoring systems aren’t ready for widespread use. Crooks could exploit the biometric data these platforms collect and weaponize it against innocent, unsuspecting kids.
7. Deepfake Technologies Are Advancing
Advancements in AI deepfake technologies threaten the reliability of FRT systems. Modern text-to-image models can create highly realistic faces, and skilled crooks could exploit these AI outputs to manipulate facial recognition systems.
Where Is Facial Recognition Technology Used?
People encounter facial recognition scanning more often than they realize. It has several applications in legal, commercial, private, and government sectors—you can’t eliminate FRT from your life overnight.
Start tracking everyday systems that use facial recognition. While most are avoidable, note that some entities might legally require you to provide your facial biometric data.
1. Unlocking Smartphones and Apps
Alphanumeric and custom numeric codes are less convenient than facial recognition but more secure. Cracking a complex passcode through brute-force attacks could theoretically take thousands of years, whereas crooks can bypass facial recognition by generating deepfake images or simply holding the device up to your face.
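A quick back-of-the-envelope calculation shows why brute-forcing a strong passcode is impractical. The numbers below are assumptions for illustration: a 12-character password drawn from 94 printable characters and an attacker guessing 10 billion combinations per second.

```python
# Rough brute-force estimate: keyspace size divided by guess rate.
charset_size = 94                     # printable ASCII characters (assumed)
password_length = 12                  # assumed password length
guesses_per_second = 10_000_000_000   # assumed offline attack speed

keyspace = charset_size ** password_length
seconds = keyspace / guesses_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:,.0f} years to exhaust the keyspace")  # roughly 1.5 million years under these assumptions
```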
We suggest turning off your phone’s in-platform facial recognition system. You can disable Face Unlock on Android devices under Security & Privacy and disable Face ID on Apple devices under Face ID & Passcode.
2. Airport Boarding and Border Checks
Countries worldwide, including the US, have started using facial recognition technologies at airports. Biometric scanning streamlines the screening and boarding process, and lines move much faster since airport security personnel no longer need to check each passenger’s ID individually.
Most major US airlines already use biometric scanning. Contact the Transportation Security Administration (TSA) directly if you have questions about the FRT systems they use.
3. Law Enforcement and Criminal Tracking
Law enforcement agencies use facial recognition systems to identify criminals, suspects, and defendants. These systems store biometric data on criminal suspects. You must comply with police officers who require FRT scans, but you can also take legal action if they misidentify you as another suspect.
4. Cashless Payment Options
FRT-enabled payment options are gradually expanding outside of China. According to CNBC, Mastercard even rolled out a payment feature in 2022 that uses facial and fingerprint biometric scanning.
The general public still has reservations about it. However, you can expect more payment processing companies to accommodate facial recognition payments as AI and virtual reality technologies evolve.
5. Tracking Employee Attendance
Many companies now monitor employee attendance through biometric time trackers. And while fingerprint scanning is more widely used, many employers have recently upgraded to FRT systems.
Talk to your HR managers about the privacy risks of facial recognition scanners. They might accommodate alternative monitoring tools if several other employees are also unsure about divulging facial biometric data.
Should You Avoid Using Facial Recognition Technologies?
You’ll encounter FRT applications regardless of how you feel about them. Private companies, government sectors, and law enforcement agencies will continue integrating facial recognition into their daily workflows. Rather than trying to avoid it entirely, take a proactive approach to protecting your biometric data.
That said, FRT isn’t inherently dangerous. Although it presents several privacy issues, it’s also a quick, convenient system that outperforms other biometric verification technologies. Consider comparing FRT with fingerprint and retina scanning as well.