
What's in Your Face? Discrimination in Facial Recognition Technology

Posted on: 2019-06-07
Degree: M.A
Type: Thesis
University: Georgetown University
Candidate: Wang, Jieshu
Full Text: PDF
GTID: 2448390005971892
Subject: Information Technology
Abstract/Summary:
This paper examines discrimination in facial recognition technology (FRT) and how to mitigate it in the contexts of academia, product development, and industrial research. FRT is the automated processing of human faces. In recent years, driven by rapid advances in machine learning, FRT has gained considerable momentum. FRT systems are increasingly trained on extraordinarily large datasets with sophisticated algorithms, and their accuracy has improved to the point that it surpasses human performance. Applications of FRT have emerged in a variety of fields, such as surveillance, the military, security, and e-commerce. At the same time, many ethical issues have been raised. This paper distinguishes two types of FRT applications: identification and classification. The former searches a target database to match a captured face and pinpoint a person's identity, while the latter classifies people into groups according to properties inferred from their facial features, such as gender, race, age, and sexual orientation. The latter type raises serious discrimination issues, because the training data is inherently biased and the technology can easily be used to develop discriminatory applications, increasing the number of people who suffer from discrimination. To mitigate these issues, three types of FRT design practice are identified: product development, academic research, and industrial research. Value Sensitive Design (VSD) is a helpful approach for minimizing discriminatory outcomes in product development. In academic settings, the traditional way to ensure ethical outcomes is through institutional review boards (IRBs), but IRBs have many disadvantages when dealing with FRT and data science in general.
In industrial research, Facebook's ethical review system, developed after the "emotion contagion" study, is discussed as a case study to demonstrate general principles that could help private companies in the FRT field mitigate discrimination issues in research, such as ethics training and building multidisciplinary review teams.
Keywords/Search Tags:FRT, Discrimination, Facial, Mitigate, Ethical, Issues