Date of Publication

3-19-2024

Document Type

Dissertation

Degree Name

Doctor of Philosophy in Philosophy

Subject Categories

Philosophy

College

College of Liberal Arts

Department/Unit

Philosophy

Thesis Advisor

Jeane C. Peracullo

Defense Panel Chair

Napoleon M. Mabaquiao, Jr.

Defense Panel Members

Hazel T. Biana
Robert James M. Boyles
Noelle Leslie G. dela Cruz
Beverly A. Sarza

Abstract/Summary

Facial recognition technologies (FRT), widely used in applications such as identity verification, surveillance, and access control, often exhibit algorithmic bias, resulting in the inaccurate identification of women, people of color, and gender-nonconforming individuals. The prevalence of such biases raises concerns about the fairness of facial recognition systems. Current AI fairness methods have attempted to address these issues through an intersectional framework, but the problem of auto-essentialization adds a critical dimension to the bias problem in FRT. Auto-essentialization refers to the perpetuation of racial and gender inequalities through automated technologies rooted in historical and colonial notions of difference, particularly concerning racialized gender. This deep-seated historical bias challenges the dominant algorithmic "de-biasing" approach of current AI fairness methods. If these historical systems of inequality are not recognized and addressed, bias in FRT may persist and be reinforced. The objective of this study is therefore two-fold: first, to analyze algorithmic bias in FRT as a process of auto-essentialization rooted in historical colonial projects of gendered and racialized classification; and second, to explore the epistemological and ethical limitations of the prevailing intersectional AI fairness approach in addressing racial and gender bias within FRT as an auto-essentialist tool. In pursuing these objectives, the study emphasizes the critical need for a reimagined intersectional AI fairness approach to mitigate bias in FRT. This approach incorporates decolonial feminist perspectives into existing AI fairness frameworks to comprehensively address the problem of auto-essentialization, which contributes to racial and gender bias in FRT.

Abstract Format

html

Language

English

Format

Electronic

Keywords

Face perception; Artificial intelligence; Decolonization; Feminist theory

Embargo Period

4-9-2025

Available for download on Wednesday, April 09, 2025
