Facial Recognition Technology in 2021: Masks, Bias, and the Future of Healthcare

Imagine walking into a hospital where, within seconds, a computer recognizes your face, links it to your appointment, and gives you customized directions to the right clinic. When you arrive at the office, your identity is confirmed from your face alone, limiting the paperwork you need to fill out. When you visit with your physician, she logs into her computer simply by looking at it. During a hospital stay, remote monitoring uses your facial expressions to identify excessive pain and alert the nurse caring for you. Best of all, there are no errors in patient identification!

Facial Recognition Technology (FRT) may have started as a science fiction dream, but over the past 20 years technological advances have made FRT possible in many industries. It has been increasingly utilized in healthcare for its perceived accuracy in identifying individuals and has been suggested as a way to improve the security of patient health information, support hygiene through contactless applications, secure employee access points, and gather patient and employee data. Since the outbreak of Coronavirus Disease 2019 (COVID-19), caused by the Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2), there has been even more excitement around contactless security such as FRT. However, the technology relies on mapping facial features, which is a challenge in the age of masks. Furthermore, FRT has inherent limitations and biases that make it difficult to implement equitably. Here we briefly explore the history of FRT, proposed uses in healthcare, the effects of COVID-19 and masks on FRT, limitations of the current technology, and the advancements needed to improve the adoption of FRT in healthcare.

History of facial recognition

The origin of FRT can largely be attributed to Woodrow Wilson Bledsoe and his work in the 1960s developing a system to identify faces from a database of thousands of photographs. His system required human input to measure the distances between coordinates on a face (e.g., the pupils, the corners of the eyes, the width of the mouth, and the tip of the nose). These measurements were then normalized against standard inputs to account for variations in lighting, head angle, and other three-dimensional manipulations of the head. The result was a system in which a computer, using these measurements, could consistently outperform humans in identifying matching photos in a fraction of the time.
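The core of this approach, hand-measured landmark coordinates turned into normalized distances and compared numerically, can be sketched in a few lines. The following is a minimal illustration only; every landmark name and coordinate below is invented, and this is not Bledsoe's actual data or code:

```python
import math

def signature(points):
    """Normalized pairwise distances between facial landmarks.

    Dividing by the interpupillary distance makes the signature
    insensitive to image scale, one of the normalizations a
    manual-measurement system needs before comparing photographs."""
    ipd = math.dist(points["left_pupil"], points["right_pupil"])
    names = sorted(points)
    return [math.dist(points[a], points[b]) / ipd
            for i, a in enumerate(names) for b in names[i + 1:]]

def best_match(query, database):
    """Return the enrolled face whose signature is closest (Euclidean)."""
    q = signature(query)
    return min(database,
               key=lambda name: math.dist(q, signature(database[name])))

# Toy database of two faces, each a dict of (x, y) landmark coordinates.
db = {
    "alice": {"left_pupil": (30, 40), "right_pupil": (70, 40),
              "nose_tip": (50, 60), "mouth_left": (38, 80), "mouth_right": (62, 80)},
    "bob":   {"left_pupil": (28, 42), "right_pupil": (72, 42),
              "nose_tip": (50, 70), "mouth_left": (34, 85), "mouth_right": (66, 85)},
}
# The same face re-photographed at twice the scale still matches,
# because the signature is normalized.
query = {k: (2 * x, 2 * y) for k, (x, y) in db["alice"].items()}
print(best_match(query, db))  # -> alice
```

The design choice worth noting is the normalization step: raw distances vary with camera distance, so only ratios of distances are comparable across photographs.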

FRT has come a long way since the 1960s, with federal research funding driving many of the advances in basic research. In the early 1990s, the Defense Advanced Research Projects Agency (DARPA) and the National Institute of Standards and Technology (NIST) developed the Face Recognition Technology (FERET) program to incentivize commercial applications of FRT. Since then, technological advancements in the consumer market have transformed what we envision as the possibilities for FRT. One of the biggest advancements has been the evolution of machine learning, which mines ever-larger databases of images to make more accurate predictions. From identifying at-risk travelers in airports to securing smartphones to generating avatars from an image of a face, FRT's uses have only been expanding.

Applications in healthcare

As the FRT market has grown, numerous stakeholders have identified ways to incorporate FRT into healthcare. Using large research data sets, machine learning has been applied to identify genetic abnormalities based on facial dimensions. The same technology is being used to monitor patients over time to detect subtle changes related to aging, pain, and emotion. One group has used FRT to verify patients prior to surgery, and another to determine an individual patient's risk of having a difficult airway. In some of these tasks, research has shown FRT to be more sensitive than clinician judgment, raising the possibility of using FRT to enhance clinical decision making. Most commonly, however, FRT is being used for human identification and security. From verifying patients accessing protected health information (PHI) to monitoring employee access to restricted areas of the hospital, FRT has the potential to improve the efficiency and security of our healthcare systems.

COVID-19 and masks

Since the outbreak of COVID-19, wearing masks has become crucial to protecting public health. Masks are part of the new normal, and their requirement in many aspects of daily life presents a unique challenge for a system designed to identify individuals by facial mapping. Most FRT relies on data points from across the face, including the nose, mouth, and jaw angle, which makes identifying a masked face particularly difficult. Many people have already experienced this difficulty with the FRT on their personal smartphones. In May 2020, the Department of Homeland Security reportedly raised concerns about the ability of FRT to correctly identify people wearing masks. This has led to a sprint by technology companies to develop alternative methods, with some focusing on FRT that relies primarily on periocular measurements. However, one leading algorithm with an accuracy of 99.77% on a standardized database achieved only 39.55% accuracy when analyzing just the periorbital area, which approximates the portion of the face not covered by a mask [1].
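The mechanism behind this degradation can be illustrated with a toy matcher. In the sketch below (all names and feature values are invented, and the model is far simpler than real face embeddings), a new photo of "bob" is matched correctly when all facial regions are visible, but is misidentified as "alice" when scoring is restricted to the periocular region, because two different people can have similar eye-region measurements:

```python
# Toy face templates: one illustrative measurement per facial region.
faces = {
    "alice": {"periocular": 0.500, "nose": 0.20, "mouth": 0.90, "jaw": 0.30},
    "bob":   {"periocular": 0.520, "nose": 0.70, "mouth": 0.10, "jaw": 0.80},
}
# A new photo of bob: close to his template overall, but his periocular
# measurement happens to land slightly nearer to alice's.
probe = {"periocular": 0.505, "nose": 0.68, "mouth": 0.12, "jaw": 0.79}

def score(a, b, regions):
    # Similarity = 1 minus the mean absolute difference over the
    # regions the matcher can still see.
    return 1 - sum(abs(a[r] - b[r]) for r in regions) / len(regions)

ALL_REGIONS = ["periocular", "nose", "mouth", "jaw"]
MASKED = ["periocular"]  # roughly what a mask leaves uncovered

full_match = max(faces, key=lambda n: score(probe, faces[n], ALL_REGIONS))
masked_match = max(faces, key=lambda n: score(probe, faces[n], MASKED))
print(full_match)    # -> bob (correct)
print(masked_match)  # -> alice (misidentified)
```

The point of the sketch is that masking does not merely add noise; it removes entire discriminative dimensions, so faces that were well separated can become indistinguishable.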

To assist with the challenge masks pose to FRT, Wuhan University released the Real-World Masked Face Recognition data set in the spring of 2020. It is the largest database of masked faces, which the researchers hope will help with the development of FRT in the age of masks [2]. However, while the database contains over 10,000 individual faces, only 525 people were actually imaged; the rest were simulated. Nevertheless, it represents a first step toward improving the technology as the COVID-19 pandemic continues to warrant wearing a mask in public.

Disparities in FRT / limitations

2020 has also reignited the call for continued racial, gender, and sexual orientation equity. Healthcare settings must continue to be welcoming and inclusive of the communities they serve, which includes ensuring that the technology we utilize minimizes any bias toward our patients. Since many FRT platforms rely on machine learning from vast databases, they are especially prone to bias introduced by the faces in those databases. Previous studies have shown that FRT can produce racially biased results if the underlying data are not representative of the community being served [3]. Researchers have also shown that machine learning could seemingly identify sexual minority men from facial features, a result that proved to be driven by grooming characteristics in the cohort and had no external validity [4]. While bias in FRT has been reported for over a decade, a NIST study in December 2019 showed that algorithms continue to disproportionately misidentify Asian Americans and African Americans at a rate 10–100 times that of white Americans [5]. As these technologies are implemented in healthcare, misidentification can lead to mistrust of the system, misdiagnosis, and failure to deliver on the promises of improved security and efficiency. If FRT is going to help clinicians develop a therapeutic alliance with patients, our patients need to be able to trust the systems being used.
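One concrete way to surface such disparity is to audit a system's match decisions per demographic group and compare false match rates, in the spirit of (though far simpler than) the NIST evaluation. A minimal sketch, with entirely invented trial records; a real audit would use logged decisions from a deployed system:

```python
# Hypothetical audit log of verification attempts:
# (demographic_group, is_genuine_pair, system_accepted).
trials = (
    [("group_a", False, True)] * 1 + [("group_a", False, False)] * 9 +
    [("group_b", False, True)] * 3 + [("group_b", False, False)] * 7
)

def false_match_rate(trials, group):
    """Fraction of impostor (non-genuine) pairs the system wrongly accepted."""
    impostor = [accepted for g, genuine, accepted in trials
                if g == group and not genuine]
    return sum(impostor) / len(impostor)

for group in ("group_a", "group_b"):
    print(group, false_match_rate(trials, group))
# In this toy log, group_b's false match rate (0.3) is three times
# group_a's (0.1); the NIST study found disparities of 10-100x.
```

Routine reporting of per-group error rates like this, rather than a single aggregate accuracy, is what makes demographic bias visible before a system is deployed on patients.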

Future of facial recognition in healthcare

As the COVID-19 pandemic continues to impact every nation on earth, wearing masks is likely to remain a part of daily life for months to years ahead as we adapt to a new normal. All is not lost, though: advances in FRT will likely begin to overcome the challenges masks add to accurate facial recognition. As these technologies continue to evolve, we need to ensure that they are developed inclusively of all people regardless of race, sex, sexual orientation, gender identity, age, origin, or background. Because many of these technologies are based on machine learning algorithms that depend on the faces used to train them, developers need to use a diverse set of faces during development, testing, and deployment. When FRT is used to access patient records, recognition errors can create security concerns and lead to data being collected from the wrong patients. If done correctly, however, FRT has the potential to improve security, efficiency, and the user experience in our healthcare system.

Data availability

N/A

Code availability

N/A

References

  1. Park, E., et al. Periocular Biometrics in the Visible Spectrum. IEEE Transactions on Information Forensics and Security, 2011.

  2. Real-World Masked Face Dataset. Open source. https://github.com/X-zhangyang/Real-World-Masked-Face-Dataset.

  3. Facial-Recognition Software Might Have a Racial Bias Problem. The Atlantic. https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/.

  4. Agüera y Arcas, B., Todorov, A., Mitchell, M. Do algorithms reveal sexual orientation or just expose our stereotypes? Medium, January 11, 2018.

  5. Grother, P., Ngan, M., Hanaoka, K. Face Recognition Vendor Test (FRVT). NISTIR 8280, NIST, 2019. https://www.nist.gov/news-events/news/2019/12/nist-study-evaluates-effects-race-age-sex-face-recognition-software.

Author information

Contributions

Reviewed and appropriate.

Corresponding author

Correspondence to Christopher Libby.

Ethics declarations

Conflicts of interest/competing interests

The authors whose names are listed on the title page certify that they have NO affiliations with or involvement in any organization or entity with any financial interest (such as honoraria; educational grants; participation in speakers’ bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent-licensing arrangements), or non-financial interest (such as personal or professional relationships, affiliations, knowledge or beliefs) in the subject matter or materials discussed in this manuscript.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the Topical Collection on Image & Signal Processing

Cite this article

Libby, C., Ehrenfeld, J. Facial Recognition Technology in 2021: Masks, Bias, and the Future of Healthcare. J Med Syst 45, 39 (2021). https://doi.org/10.1007/s10916-021-01723-w