A.I. Among Us: Agency in a World of Cameras and Recognition Systems


REFERENCES CITED

Bates, J., Y.-W. Lin, and P. Goodale. 2016. Data Journeys: Capturing the Socio-Material Constitution of Data Objects and Flows. Big Data & Society 3(2). DOI: 10.1177/2053951716654502.

Bennett, Jane. 2009. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press.

Besky, Sarah and Alex Blanchette. 2019. How Nature Works: Rethinking Labor on a Troubled Planet. Ann Arbor, MI: University of Michigan Press.

Blackman, James. 2019. Surveillance Cams to Take 70% of 5G IoT in 2020. Enterprise IoT Insights website, October 24. Accessed October 24, 2019. https://enterpriseiotinsights.com/20191024/connected-cars-2/surveillance-cams-and-c-v2x-to-take-70-of-5g-iot-share-says-gartner?utm_campaign=20191024%20Enterprise%20IoT%20NewsletterThurs&utm_medium=email&utm_source=Eloqua

Brewster, Thomas. 2019. London Police Facial Recognition ‘Fails 80% Of The Time And Must Stop Now.’ Forbes website, July 4. Accessed July 31, 2019. https://www.forbes.com/sites/thomasbrewster/2019/07/04/london-police-facial-recognition-fails-80-of-the-time-and-must-stop-now/#54fdcdf0bf95

Burrell, Jenna. 2016. How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms. Big Data & Society website, Jan. Accessed May 15, 2019. https://journals.sagepub.com/doi/full/10.1177/2053951715622512

Business Times. 2019. China Shoppers Adopt Facial Recognition Payment Technology. Business Times website, Sept 5. Accessed Sept 5, 2019. https://www.businesstimes.com.sg/technology/china-shoppers-adopt-facial-recognition-payment-technology

Caronia, Letizia and Luigina Mortari. 2015. The Agency of Things: How Spaces and Artefacts Organize the Moral Order of an Intensive Care Unit. Social Semiotics 25(4): 401-422.

Crawford, Kate and Jason Schultz. 2013. Big Data and Due Process: Toward a Framework to Redress Predictive Privacy Harms. Boston College Law Review 55: 93-128.

Collier, Stephen and Aihwa Ong (eds.) 2008. Global Assemblages: Technology, Politics, and Ethics as Anthropological Problems. Malden, MA: Blackwell.

Deleuze, Gilles and Félix Guattari (Translation and Introduction by Brian Massumi). 1987. A Thousand Plateaus: Capitalism and Schizophrenia. Minneapolis: University of Minnesota Press.

Dwork, Cynthia and Deirdre K. Mulligan. 2013. It’s Not Privacy, and It’s Not Fair. Stanford Law Review Online 66: 35-40.

Elish, Madeleine Clare. 2018. “The Stakes of Uncertainty: Developing and Integrating Machine Learning in Clinical Care.” 2018 Ethnographic Praxis in Industry Conference Proceedings, pp. 364–380. https://www.epicpeople.org/machine-learning-clinical-care/

Einhorn, Erin. 2019. A Fight Over Facial Recognition is Dividing Detroit — with High Stakes for Police and Privacy. NBC News website, August 22. Accessed October 1, 2019. https://www.nbcnews.com/news/us-news/fight-over-facial-recognition-dividing-detroit-high-stakes-police-privacy-n1045046.

Ellis, Rebecca. 2019. Portland Considers Banning Use Of Facial Recognition Software In Private Sector. OPB News website, Sept 17. Accessed Sept 17, 2019. https://www.opb.org/news/article/portland-facial-recognition-software-private-sector-use-ban/

Eubanks, Virginia. 2018. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press.

Foucault, Michel (Alan Sheridan translation). 1995. Discipline and Punish: The Birth of the Prison. New York: Vintage Books.

Garvie, Clare. 2019. Garbage In, Garbage Out. Georgetown Law’s Center on Privacy and Technology website, May 16. Accessed July 10, 2019. https://www.flawedfacedata.com

Gibson, William. 1999. The Science in Science Fiction. Talk of the Nation, NPR, November 30, Timecode 11:55. Accessed July 9, 2019. https://www.npr.org/2018/10/22/1067220/the-science-in-science-fiction

The Guardian. 2019. The Guardian View on Facial Recognition: a Danger to Democracy. The Guardian website, June 9. Accessed June 9, 2019. https://www.theguardian.com/commentisfree/2019/jun/09/the-guardian-view-on-facial-recognition-a-danger-to-democracy

Harwell, Drew. 2019. Facial-recognition Use by Federal Agencies Draws Lawmakers’ Anger. Washington Post website, July 9. Accessed July 9, 2019. https://www.washingtonpost.com/technology/2019/07/09/facial-recognition-use-by-federal-agencies-draws-lawmakers-anger/

Horvitz, Eric and Deirdre Mulligan. 2015. Data, Privacy, and the Greater Good. Science, 349 (6245):253-255.

Introna, Lucas and Helen Nissenbaum. 2009. “Facial Recognition Technology: A Survey of Policy and Implementation Issues,” Report of the Center for Catastrophe Preparedness and Response, NYU.

Kohn, Eduardo. 2013. How Forests Think: Toward an Anthropology Beyond the Human. Berkeley, CA: University of California Press.

Latour, Bruno. 2005. Reassembling the Social: An Introduction to Actor-Network-Theory. New York: Oxford University Press.

Metz, Rachel. 2019. Amazon Wins Facial-Recognition Vote, but Concerns about the Tech Aren’t Going Away. CNN website, May 22. Accessed July 19, 2019. https://www.cnn.com/2019/05/22/tech/amazon-facial-recognition-vote/index.html

Milgram, Stanley. 1972. “The Familiar Stranger: An Aspect of Urban Anonymity.” In The Division 8 Newsletter, Division of Personality and Social Psychology. Washington: American Psychological Association.

Mintz, Sidney. 1985. Sweetness and Power: The Place of Sugar in Modern History. New York: Viking-Penguin.

Newman, Lily Hay. 2019. Facial Recognition Has Already Reached Its Breaking Point. Wired website, May 22. Accessed May 29, 2019. https://www.wired.com/story/facial-recognition-regulation/

Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.

O’Neil, Cathy. 2016. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. New York: Crown Publishing.

Pasquale, Frank. 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, MA: Harvard University Press.

Raji, Inioluwa Deborah and Joy Buolamwini. 2019. “Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products.” Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society.

Ravani, Sarah. 2019. Oakland Bans Use of Facial Recognition Technology, Citing Bias Concerns. San Francisco Chronicle website, July 17. Accessed July 17, 2019. https://www.sfchronicle.com/bayarea/article/Oakland-bans-use-of-facial-recognition-14101253.php

Seaver, Nick. 2017. Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems. Big Data & Society website, Nov. Accessed May 15, 2019. https://doi.org/10.1177/2053951717738104

Statt, Nick. 2019. “Orlando Police Once Again Ditch Amazon’s Facial Recognition Software.” The Verge, July 18. Accessed August 1, 2019. https://www.theverge.com/2019/7/18/20700072/amazon-rekognition-pilot-program-orlando-florida-law-enforcement-ended

Stayton, Erik, Melissa Cefkin and Jingyi Zhang. 2017. “Autonomous Individuals in Autonomous Vehicles: The Multiple Autonomies of Self-Driving Cars.” 2017 Ethnographic Praxis in Industry Conference Proceedings. https://www.epicpeople.org/autonomous-individuals-autonomous-vehicles/

Stayton, Erik, and Melissa Cefkin. 2018. “Designed for Care: Systems of Care and Accountability in the Work of Mobility.” 2018 Ethnographic Praxis in Industry Conference Proceedings, pp. 334–350, ISSN 1559-8918. https://www.epicpeople.org/care-accountability-work-mobility/

Teicher, Jordan. 2018. What Do Facial Recognition Technologies Mean for Our Privacy? New York Times website, July 18. Accessed July 18, 2019. https://www.nytimes.com/2018/07/18/lens/what-do-facial-recognition-technologies-mean-for-our-privacy.html

Thadani, Trisha. 2019. San Francisco Bans City Use of Facial Recognition Surveillance Technology. San Francisco Chronicle website, May 14. Accessed July 5, 2019. https://www.sfchronicle.com/politics/article/San-Francisco-bans-city-use-of-facial-recognition-13845370.php

Tsing, Anna Lowenhaupt. 2015. The Mushroom at the End of the World: On the Possibility of Life in Capitalist Ruins. Princeton, NJ: Princeton University Press.

Trump, Donald. 2017. Executive Order 13780: Protecting the Nation from Foreign Terrorist Entry into the United States. March 6, 2017. https://www.whitehouse.gov/presidential-actions/executive-order-protecting-nation-foreign-terrorist-entry-united-states-2/

Wang, Cunrui, Qingling Zhang, Wanquan Liu, Yu Liu and Lixin Miao. 2018. Facial Feature Discovery for Ethnicity Recognition. WIREs: Data Mining and Knowledge Discovery. Wiley. July 2018. https://doi.org/10.1002/widm.1278

White, Geoff. 2019. Use of Facial Recognition Tech ‘Dangerously Irresponsible’. BBC News website, May 13. Accessed July 10, 2019. https://www.bbc.com/news/technology-48222017

NOTES

Acknowledgments – We’d like to thank all the people who gave us time out of their busy days to share their thoughts, stories, and experiences with us. We’d also like to thank the institutions that opened doors for us and welcomed us into their communities. Finally, we’d like to thank Ellie Rennie, our EPIC curator, for her truly helpful comments and support.

1. We draw upon research primarily from China, with some comparative or contrastive sites in the USA. Pseudonyms are used throughout this paper. The research in China was conducted in 2018. We spent two weeks surveying recognition programs in public use in Beijing, Shanghai, and Hangzhou. These were primarily one-on-one sessions around particular recognition programs, e.g., access to banking, access to work, smiling to pay, etc. While trying to understand how these systems (and others they used) were used, we also explored the broader context of participants’ lives. We returned six months later and spent ten days doing deeper dives around recognition systems in educational institutions. We focused primarily on three high schools: two public and one private. The schools discussed in this paper are both public schools. One was among the poorer schools in its district, while the other was situated in a university community. At the schools, we interviewed a variety of stakeholders: teachers, administrators, staff, students, and parents. Independent of the interviews at the schools, we talked to representatives of some of the companies that provided the systems to the schools. The school administrations asked that their schools’ names not be used in any report. Likewise, all the participants in the research have been anonymized. None of the systems created for the schools in China were products or services at the time we did our research – they were experiments. High School Z uses a team of parents, teachers, staff, and administrators to brainstorm uses for new applications that they want to bring onto campus. The administrator and IT lead then try to find companies (large or small) interested in creating the system for the school, creating public and private partnerships. In general, when we were doing the research, the public schools in China had no guidance on which systems to build, buy, or deploy – everything was an experiment.
The research in the USA consisted primarily of site visits. We visited the sheriff’s department in May of 2018 and the St Nicholas school in March of 2019. The facial recognition software used by St Nicholas is a commercial product. The former visit was done as part of an exploration of the landscape of facial recognition uses; the latter was conducted as a point of comparison to what we had seen in China.

2. When we were in China, the stories about facial recognition systems being used on the Uyghurs had not yet become mainstream media content in the USA or China. The stories of facial recognition then circulating were about people being ticketed for minor offenses (e.g., jaywalking), dispensing toilet paper, criminals being identified and/or caught on the street (or at events), authenticating appropriate car-service drivers, and so on. The camera surveillance system was primarily explained in terms of safety and civic etiquette: reinforcing the way people were to behave and protecting against those who violate etiquette and laws. No one we talked to wanted to see fewer recognition systems in place; most had ideas of where they wanted to see more, e.g., “ticket dog poopers who aren’t scoopers,” “find my child,” “reward appropriate behavior in Starbucks (throwing trash away).”

3. As mentioned, the recognition systems in the schools should be considered experiments. The affect system was an experiment to create a better classroom experience for learning. For those in the USA, the in-school experience in China is a little different, particularly when looking at something like affect detection. The value of the student is judged more on how he/she/they perform on the national exams than on grades in school. In every class we observed, someone slept during class; the reason given was that they had been studying non-class material for the national exam until late in the night and were tired. All students and parents talked about using materials from outside of schoolwork to help them with the national exam. The importance of the exam versus the school plays out in the various systems in that the system’s evaluation of the student (attentive or not) does not really impact the student as much as such a system might in the USA. Of course, everyone wants to score well on everything; however, whereas a grade in a course might greatly affect a student’s future in the USA, it is the national exam that affects a student’s future in China. HS X, in part, was using the affect system to try to create a more dynamic learning environment for everyone, in the hopes of improving its students’ overall performance on the national exams.

2019 EPIC Proceedings, ISSN 1559-8918, https://www.epicpeople.org/epic


