Our interactions with online tools and social media platforms entangle us in complex big data and surveillance systems. Public spaces and even social interactions increasingly subject us to recording and facial recognition systems.[i] Private spaces, increasingly integrated into the Internet of Things (IoT) via smart devices,[ii] open our domestic lives to surveillance. DNA (deoxyribonucleic acid) and other biometric identifiers are already used by some governments, and may someday be keys to accessing certain social and economic benefits.
These systems already help us make better sense of our lives, but not without some downsides. On one hand, they reveal our screen addictions, provide personal health information, suggest fantastic TV shows, and help us find romantic partners or people who share our hobbies. On the other hand, there are emerging concerns about the ways big data is used to generate actionable insights about us. These insights are used to sort us into groups[iii] based on our preferences, evaluate us against criteria,[iv] monitor us in the workplace,[v] spread disinformation,[vi] and offer targeted ads for products.[vii]
In the future, these systems could shape many aspects of our lives. They may determine access to education, insurance, employment, and dating opportunities, especially if they include biological data such as our DNA. They may also shape how institutions, corporations, and other people see and treat us. Enhanced surveillance and quantification could help track the spread of disease[viii] in future pandemics, reach equity targets,[ix] or support sustainability efforts.[x] On the other hand, these systems could intensify social divides by strengthening filter bubbles.[xi] Or they might provide the raw materials for new or more effective kinds of manipulation and control, including private sector “reputation systems” and marketing chatbots with insights into our emotional and physical states.
References
[i] Elizabeth Thompson, “Canada planning technological fixes to make crossing the border faster,” CBC, last modified Jan. 24, 2022, https://www.cbc.ca/news/politics/border-airports-technology-biometric-1.6323855.
[ii] Dan Greene, “The erosion of personal ownership,” Vox, last modified Apr. 21, 2021, https://www.vox.com/the-goods/22387601/smart-fridge-car-personal-ownership-internet-things.
[iii] Charlene Chu, Kathleen Leslie, Rune Nyrup, & Shehroz Khan, “Artificial Intelligence can discriminate on the basis of race and gender, and also age,” The Conversation, last modified Jan. 18, 2022, https://theconversation.com/artificial-intelligence-can-discriminate-on-the-basis-of-race-and-gender-and-also-age-173617.
[iv] A. Fisher, “An Algorithm May Decide Your Next Pay Raise,” Fortune, last modified July 14, 2019, https://fortune.com/2019/07/14/artificial-intelligence-workplace-ibm-annual-review/.
[v] Matthew Finnegan, “Rise in employee monitoring prompts calls for new rules to protect workers,” Computerworld, last modified Nov. 30, 2021, https://www.computerworld.com/article/3642712/rise-in-employee-monitoring-prompts-calls-for-new-rules-to-protect-workers.html; Clive Thompson, “What AI exam proctors are really teaching our children,” Wired, last modified Oct. 20, 2020, https://www.wired.com/story/ai-college-exam-proctors-surveillance/.
[vi] “How is Adtech funding misogynistic disinformation?” Global Disinformation Index, last modified Jan. 4, 2023, https://www.disinformationindex.org/disinfo-ads/2023-01-04-how-is-ad-tech-funding-misogynistic-disinformation/.
[vii] “‘Surveillance capitalism’ by tech giants is a growing threat, says privacy watchdog,” Global News, last modified Dec. 9, 2021, https://globalnews.ca/news/8436973/privacy-personal-information-surveillance-capitalism-watchdog-report/.
[viii] “BC professor to lead network aimed at helping Canada prepare for future pandemics,” CTV, last modified Apr. 12, 2021, https://bc.ctvnews.ca/b-c-professor-to-lead-network-aimed-at-helping-canada-prepare-for-future-pandemics-1.5383845.
[ix] K. Hanegan, “How Data Can Supercharge Your DEI Program,” InformationWeek, last modified Dec. 26, 2022, https://www.informationweek.com/big-data/how-data-can-supercharge-your-dei-program.
[x] J. Lou, N. Hultman, A. Patwardhan, and Y. Qiu, “Integrating sustainability into climate finance by quantifying the co-benefits and market impact of carbon projects,” Communications Earth & Environment 3 (2022), https://www.nature.com/articles/s43247-022-00468-9.
[xi] A filter bubble is the “state of intellectual isolation” that results when search engine algorithms feed you a narrow range of information based on the data they have about your location, search history, and previous clicks. “Filter Bubble,” Wikipedia, accessed Aug. 5, 2022, https://en.wikipedia.org/wiki/Filter_bubble.