Under Observation: Artificial Intelligence, Facial Recognition and Biases

  • Tomás Balmaceda
Universidad de Buenos Aires / IIF-SADAF – CONICET. tomasbalmaceda[at]gmail.com
  • Tobías Schleider
    UNS-UNMDP-ILSED
  • Karina Pedace
UBA / UNLaM / IIF-SADAF – CONICET

Abstract

In this paper we focus on facial recognition technology and emphasize that, over recent decades, public discourse has installed the idea that this type of technology is “objective” and free of human error and bias, attributing any responsibility to the “good or bad use” made of it. Against this rhetoric, we argue that when it comes to technological artifacts neutrality is impossible, even at the very instance of design. To this end, drawing on the ideas of North American pragmatism, we analyze a real case to show that a dichotomy between facts and values cannot be sustained, and that there is no sharp distinction between the design of a technology and the use we make of it.
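The abstract claims that bias can enter at the very instance of design. As a purely illustrative sketch, not drawn from the paper and with every number invented for the example, the following Python snippet shows how one design decision of a hypothetical face-matching system, a single global match threshold applied to groups whose impostor-score distributions differ because of unbalanced training data, yields unequal false match rates.

# Illustrative sketch only: the distributions, threshold and group labels below
# are invented assumptions, not results reported in the paper.
import numpy as np

rng = np.random.default_rng(0)

def impostor_scores(mean, spread, n=100_000):
    """Simulated similarity scores for pairs of *different* people."""
    return rng.normal(mean, spread, n)

# Hypothetical assumption: the model was trained on data that under-represents
# group B, so impostor pairs from group B receive higher similarity scores.
scores_a = impostor_scores(mean=0.30, spread=0.10)
scores_b = impostor_scores(mean=0.40, spread=0.12)

threshold = 0.55  # one "neutral" operating point fixed at design time

fmr_a = float(np.mean(scores_a >= threshold))  # false match rate, group A
fmr_b = float(np.mean(scores_b >= threshold))  # false match rate, group B

print(f"False match rate, group A: {fmr_a:.4f}")
print(f"False match rate, group B: {fmr_b:.4f}")
# With these invented numbers, group B is falsely matched far more often than
# group A at the same threshold: the operating point is a value-laden design
# choice, not a neutral technical fact.

Under these assumptions the two groups face very different risks of being wrongly flagged, which is one concrete way in which a design decision can carry values.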
References
ADC (2020). Con mi cara no. Asociación por los Derechos Civiles. http://conmicarano.adc.org.ar/

Baratta, Alessandro (2003). Principios de derecho penal mínimo. In Alessandro Baratta, Criminología y Sistema Penal (in memoriam compilation) (pp. 299-333). Buenos Aires: B de F.

Bentham, Jeremy (1791). Panopticon, or the Inspection House. London: T. Payne.

Binder, Alberto (2009). El control de la criminalidad en una sociedad democrática. In Gabriel Kessler (ed.), Seguridad y Ciudadanía. Nuevos paradigmas, reforma policial y políticas innovadoras (pp. 25-54). Buenos Aires: Edhasa.

Caplow, Theodore, Bahr, Howard M., Modell, John and Chadwick, Bruce A. (1994). Recent Social Trends in the United States 1960-1990. Montreal: McGill-Queen’s University Press.

Cognitec (2020). Cognitec supports fighting crime and curtailing human bias with face recognition technologies. https://www.cognitec.com/news-reader/fighting-crime-and-curtailing-human-bias-with-face-recognition.html

Debord, Guy (1967). La société du spectacle. Paris: Buchet-Chastel.

Deleuze, Gilles (1992). Postscript on the Societies of Control. October, 59, 3-7.

Eubanks, Virginia (2018). Automating inequality: How High-Tech Tools Profile, Police and Punish the Poor. New York: St. Martin’s Press. https://doi.org/10.1080/10999922.2018.1511671

Feenberg, Andrew (2002). Transforming Technology: A Critical Theory Revisited. New York: Oxford University Press.

Feinstein, Dianne (2001). Biometrics Identifiers and the Modern Face of Terror: New Technologies in the Global War on Terrorism — Open Statement, November 14, 2001. Washington, DC: U. S. Government Printing Office. https://www.govinfo.gov/content/pkg/CHRG-107shrg81678/html/CHRG-107shrg81678.htm

Friedman, Batya and Nissenbaum, Helen (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS), 14(3), 330-347. https://doi.org/10.1145/230538.230561

Fossi, Connie and Prazan, Phil (2020). Miami Police Used Facial Recognition Technology in Protester’s Arrest. https://www.nbcmiami.com/investigations/miami-police-used-facial-recognition-technology-in-protesters-arrest/2278848/

Foucault, Michel (2013). La société punitive: Cours au Collège de France, 1972-1973, ed. Bernard E. Harcourt. Paris: Gallimard - Le Seuil.

Gates, Kelly A. (2011). Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance. New York: NYU Press. https://doi.org/10.18574/9780814733035

GIFT (2020). Caja de herramientas humanísticas. https://grupo.gift

Haggerty, Kevin D. and Ericson, Richard V. (2000). The Surveillant Assemblage. British Journal of Sociology, 51(4), 605-622. https://doi.org/10.1080/00071310020015280

Han, Byung-Chul (2015). The Transparent Society. Stanford: Stanford University Press.

Haraway, Donna (1991). A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century. In Simians, Cyborgs and Women: The Reinvention of Nature (pp. 149-181). New York: Routledge.

Harcourt, Bernard E. (2007). Against Prediction. Profiling, Policing, and Punishing in an Actuarial Age. Chicago: University of Chicago Press. https://doi.org/10.7208/chicago/9780226315997.001.0001

Harcourt, Bernard E. (2011). The Illusion of Free Markets. Punishment and the Myth of the Natural Order. Cambridge: Harvard University Press. https://doi.org/10.2307/j.ctvjhzpv2

Harcourt, Bernard E. (2015). Exposed. Desire and Disobedience in the Digital Age. Cambridge: Harvard University Press. https://doi.org/10.4159/9780674915077

Hua, Gang (2015). Probabilistic Elastic Part Model for Real-World Face Recognition. In Ji, Qiang et al. (eds.), Face and Facial Expression Recognition from Real World Videos. FFER 2014. Lecture Notes in Computer Science, vol. 8912 (pp. 3-10). Cham: Springer. https://doi.org/10.1007/978-3-319-13737-7_1

Israel, Steve A. and Irvine, John (2012). Heartbeat biometrics: a sensing system perspective. International Journal of Cognitive Biometrics, 1(1), 39-65. https://doi.org/10.1504/IJCB.2012.046514

Johnson, Jeffrey A. (2006). Technology and pragmatism: From value neutrality to value criticality. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2154654

Kafka, Franz (1925). Der Process. Berlin: Verlag Die Schmiede.

Latour, Bruno and Woolgar, Steve (1979). Laboratory life. The social construction of scientific facts. Beverly Hills, California: Sage Publications.

Li, Stan Z. and Jain, Anil K. (eds.) (2004). Introduction. In Handbook of Face Recognition (pp. 1-18). London: Springer Verlag. https://doi.org/10.1007/978-0-85729-932-1_1

Medina, R., III, et al. (2019). Authentication based on heartbeat detection and facial recognition in video data. United States Patent 10,268,916 B1. https://patentimages.storage.googleapis.com/a8/12/73/ca411b4b82e4e4/US10268910.pdf

Nait-ali, Amine (ed.) (2020). Hidden Biometrics. When Biometrics Security Meets Biomedical Engineering. Singapore: Springer. https://doi.org/10.1007/978-981-13-0956-4

Newell, Sue and Marabelli, Marco (2015). Strategic opportunities (and challenges) of algorithmic decision-making: A call for action on the long-term societal effects of ‘datification’. The Journal of Strategic Information Systems, 24, 3-14. https://doi.org/10.1016/j.jsis.2015.02.001

Online Etymology Dictionary (2020). Entry: “mug”. https://www.etymonline.com/word/mug

O’Malley, Pat (1992). Risk, Power and Crime Prevention. Economy and Society, 21(3), 252-275. https://doi.org/10.1080/03085149200000013

Orwell, George (1949). 1984. London: Secker & Warburg.

Petersen, Julie K. (2001). Understanding Surveillance Technologies: Spy Devices, Their Origins & Applications. Boca Raton: CRC Press.

PNUD (1995). Informe sobre Desarrollo Humano 1994. Washington: Programa de las Naciones Unidas para el Desarrollo.

Putnam, Hilary (2002). The Collapse of the Fact/Value Dichotomy and Other Essays. Cambridge, MA: Harvard University Press.

Ramli, Dzati A., Hooi, Man Y. and Chee, Kai J. (2016). Development of Heartbeat Detection Kit for Biometric Authentication System. Procedia Computer Science, 96, 305-314. https://doi.org/10.1016/j.procs.2016.08.143

Rhodes, Henry T. F. (1956). Alphonse Bertillon: Father of Scientific Detection. New York: Abelard-Schuman.

Rodríguez, Alejandra (2019). Facial recognition: A two-sided story. https://www.belatrixsf.com/blog/facial-recognition

Vaidhyanathan, Siva (2011). The Googlization of Everything —and Why We Should Worry. Berkeley: University of California Press. https://doi.org/10.1525/9780520948693

Vucetich, Juan (1904). Dactiloscopía comparada, el nuevo sistema argentino: trabajo hecho expresamente para el 2do. Congreso Médico Latino-americano, Buenos Aires, 3-10 de abril de 1904. Buenos Aires: Peuser.

How to cite

Balmaceda, T., Schleider, T., & Pedace, K. (2021). Under Observation: Artificial Intelligence, Facial Recognition and Biases. ArtefaCToS. Revista de Estudios sobre la Ciencia y la Tecnología, 10(2), 21–43. https://doi.org/10.14201/art20211022143
